Spaceship Titanic - Machine Learning Competition¶
ELEN4025: Group Project
Group Members
- Muaawiyah Dadabhay - (https://www.kaggle.com/muaawiyahdadabhay) - 2426234
- Muhammed Raees Dindar - (https://www.kaggle.com/muhammadraeesdindar) - 2453739
- Taahir Kolia - (https://www.kaggle.com/taahirkolia) - 2423748
- Irfaan Mia - (https://www.kaggle.com/irfaanmia) - 2434204
Introduction¶
In this notebook, five machine-learning models are employed to predict whether passengers were transported to an alternate dimension, using data sourced from Kaggle. The models include Logistic Regression, Random Forest Classification, Naïve Bayes, XGBoost Classification, and CatBoost Classification.
Kaggle stands as a prominent data science platform renowned for hosting competitions and challenges in the field. It boasts a vibrant community comprising data scientists, statisticians, and machine learning experts who actively contribute their expertise to various projects.
For the analysis at hand, the Spaceship Titanic dataset available on Kaggle is utilized. The central objective of this study is to determine which passengers were transported to an alternate dimension following a spaceship collision.
The methodology involves an initial phase of data analysis, followed by feature engineering to facilitate model construction. Subsequently, the models are trained and evaluated using the provided dataset. Ultimately, comprehensive assessments are conducted on all five models to identify the optimal approach for predicting passenger transportation to alternate dimensions.
It is noteworthy to mention that the development of Python code was facilitated with the assistance of GitHub Copilot, an AI tool designed to aid programmers in streamlining code. Additionally, comprehension of classifier documentation was augmented through the use of Bing AI, which provided valuable insights into understanding the intricacies of classifier methodologies.
Importing of Python Libraries¶
#General imports
import numpy as np
import pandas as pd
import statsmodels.api as sm
import seaborn as sns
import matplotlib.pyplot as plt
import plotly.express as px
from tqdm.auto import tqdm
#Analysis imports
import sweetviz as sw
#Imputator and encoder imports
from sklearn.preprocessing import LabelEncoder
from sklearn.impute import SimpleImputer,KNNImputer
from sklearn.preprocessing import StandardScaler
from sklearn.preprocessing import OneHotEncoder
#Evaluation imports
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import confusion_matrix, accuracy_score, mean_squared_error, r2_score, roc_auc_score, roc_curve, classification_report
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import make_pipeline
#Classifiers imports
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from catboost import CatBoostClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
# Feature elimination imports
from sklearn.feature_selection import RFE
from sklearn.inspection import permutation_importance
#Hyperparameter tuning imports
from sklearn.model_selection import GridSearchCV
import optuna
from sklearn.model_selection import RandomizedSearchCV
from scipy.stats import uniform, randint
from sklearn.model_selection import cross_validate
from sklearn.metrics import make_scorer, accuracy_score
#Suppress warning messages to keep the notebook output readable
import warnings
warnings.filterwarnings("ignore")
1. Gathering and Understanding Data¶
The focus of this Section is on collecting and comprehending the dataset. The process begins with loading the data from a CSV file in Section 1.1. Following this, Section 1.2 entails a thorough examination of the dataset to gain insights into its structure and contents. Subsequently, Section 1.3 delves deeper into understanding the significance and characteristics of the data at hand. Detecting and handling duplicate values within the dataset is discussed in Section 1.4, while Section 1.5 addresses the identification and treatment of missing values to ensure data integrity and completeness.
1.1 Loading Data from a CSV¶
test_df = pd.read_csv("../data/test.csv")
train_df = pd.read_csv("../data/train.csv")
1.2 Viewing of the Data¶
test_df.head(3)
| | PassengerId | HomePlanet | CryoSleep | Cabin | Destination | Age | VIP | RoomService | FoodCourt | ShoppingMall | Spa | VRDeck | Name |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0013_01 | Earth | True | G/3/S | TRAPPIST-1e | 27.0 | False | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | Nelly Carsoning |
| 1 | 0018_01 | Earth | False | F/4/S | TRAPPIST-1e | 19.0 | False | 0.0 | 9.0 | 0.0 | 2823.0 | 0.0 | Lerome Peckers |
| 2 | 0019_01 | Europa | True | C/0/S | 55 Cancri e | 31.0 | False | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | Sabih Unhearfus |
train_df.head(3)
| | PassengerId | HomePlanet | CryoSleep | Cabin | Destination | Age | VIP | RoomService | FoodCourt | ShoppingMall | Spa | VRDeck | Name | Transported |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0001_01 | Europa | False | B/0/P | TRAPPIST-1e | 39.0 | False | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | Maham Ofracculy | False |
| 1 | 0002_01 | Earth | False | F/0/S | TRAPPIST-1e | 24.0 | False | 109.0 | 9.0 | 25.0 | 549.0 | 44.0 | Juanna Vines | True |
| 2 | 0003_01 | Europa | False | A/0/S | TRAPPIST-1e | 58.0 | True | 43.0 | 3576.0 | 0.0 | 6715.0 | 49.0 | Altark Susent | False |
1.3 Understanding the Data at Hand¶
test_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 4277 entries, 0 to 4276
Data columns (total 13 columns):
 #   Column        Non-Null Count  Dtype
---  ------        --------------  -----
 0   PassengerId   4277 non-null   object
 1   HomePlanet    4190 non-null   object
 2   CryoSleep     4184 non-null   object
 3   Cabin         4177 non-null   object
 4   Destination   4185 non-null   object
 5   Age           4186 non-null   float64
 6   VIP           4184 non-null   object
 7   RoomService   4195 non-null   float64
 8   FoodCourt     4171 non-null   float64
 9   ShoppingMall  4179 non-null   float64
 10  Spa           4176 non-null   float64
 11  VRDeck        4197 non-null   float64
 12  Name          4183 non-null   object
dtypes: float64(6), object(7)
memory usage: 434.5+ KB
train_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 8693 entries, 0 to 8692
Data columns (total 14 columns):
 #   Column        Non-Null Count  Dtype
---  ------        --------------  -----
 0   PassengerId   8693 non-null   object
 1   HomePlanet    8492 non-null   object
 2   CryoSleep     8476 non-null   object
 3   Cabin         8494 non-null   object
 4   Destination   8511 non-null   object
 5   Age           8514 non-null   float64
 6   VIP           8490 non-null   object
 7   RoomService   8512 non-null   float64
 8   FoodCourt     8510 non-null   float64
 9   ShoppingMall  8485 non-null   float64
 10  Spa           8510 non-null   float64
 11  VRDeck        8505 non-null   float64
 12  Name          8493 non-null   object
 13  Transported   8693 non-null   bool
dtypes: bool(1), float64(6), object(7)
memory usage: 891.5+ KB
Categorical data, including HomePlanet, CryoSleep, Destination, and VIP, is observed in the dataframes. These features will undergo transformation into numerical representations for further analysis.
1.4 Determining Duplicate Values¶
print(f"The total duplicates in the test dataframe are: {test_df.duplicated().sum()}")
print(f"The total duplicates in the train dataframe are: {train_df.duplicated().sum()}")
The total duplicates in the test dataframe are: 0
The total duplicates in the train dataframe are: 0
1.5 Determining Missing Values¶
print("Test Dataframes Missing Values")
test_df.isna().sum()
Test Dataframes Missing Values
PassengerId       0
HomePlanet       87
CryoSleep        93
Cabin           100
Destination      92
Age              91
VIP              93
RoomService      82
FoodCourt       106
ShoppingMall     98
Spa             101
VRDeck           80
Name             94
dtype: int64
print("Train Dataframes Missing Values")
train_df.isna().sum()
Train Dataframes Missing Values
PassengerId       0
HomePlanet      201
CryoSleep       217
Cabin           199
Destination     182
Age             179
VIP             203
RoomService     181
FoodCourt       183
ShoppingMall    208
Spa             183
VRDeck          188
Name            200
Transported       0
dtype: int64
Both dataframes contain missing values, necessitating data manipulation. Initially, missing values will be replaced with NaN. Subsequently, dataframes will be merged for efficient filling of values and separated as needed in subsequent stages.
combined_df = pd.concat([train_df, test_df], ignore_index=True)
temp_combined_df = combined_df.copy()
# Standardize empty strings to NaN (np.nan is the canonical missing-value marker)
combined_df.replace('', np.nan, inplace=True)
combined_df.tail(3)
| | PassengerId | HomePlanet | CryoSleep | Cabin | Destination | Age | VIP | RoomService | FoodCourt | ShoppingMall | Spa | VRDeck | Name | Transported |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 12967 | 9271_01 | Mars | True | D/296/P | 55 Cancri e | NaN | False | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | Jayrin Pore | NaN |
| 12968 | 9273_01 | Europa | False | D/297/P | NaN | NaN | False | 0.0 | 2680.0 | 0.0 | 0.0 | 523.0 | Kitakan Conale | NaN |
| 12969 | 9277_01 | Earth | True | G/1498/S | PSO J318.5-22 | 43.0 | False | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | Lilace Leonzaley | NaN |
2. Data Analysis¶
In this section, the dataset undergoes a comprehensive analysis to gain insights into its categorical and numerical features. Section 2.1 focuses on the examination and exploration of categorical features. Following this, in Section 2.2, an analysis of numerical features is conducted.
# Figure size
plt.figure(figsize=(6,6));
# Pie plot
train_df['Transported'].value_counts().plot.pie(explode=[0.1,0.1], autopct='%1.1f%%', shadow=True, textprops={'fontsize':16}).set_title("Target distribution");
The target feature, Transported, indicates whether passengers were transported to another dimension, with approximately half being transported.
2.1 Categorical Feature Analysis¶
The categorical features are analysed to determine how the target feature depends on the categorical features.
# Categorical features
cat_feats=['HomePlanet', 'CryoSleep', 'Destination', 'VIP']
# Convert boolean values to strings ('Yes' and 'No')
temp_combined_df['Transported'] = combined_df['Transported'].map({True: 'Yes', False: 'No'})
# Plot categorical features
fig = plt.figure(figsize=(10, 16))
for i, var_name in enumerate(cat_feats):
    ax = fig.add_subplot(4, 1, i + 1)
    sns.countplot(data=temp_combined_df, x=var_name, ax=ax, hue='Transported')
    ax.set_title(var_name)
fig.tight_layout()
plt.show()
The relationship between the categorical features and Transported is explored. While no strong relationship is found with HomePlanet, Destination, and VIP, CryoSleep appears correlated with the target, indicating a higher likelihood of transportation for those in CryoSleep. [1]
2.2 Numerical Feature Analysis¶
The numerical features are analysed to determine how the target feature depends on the numerical features.
# Create a copy of the DataFrame
combined_df_copy = combined_df.replace({True: 1, False: 0})
# Drop NaN values from the DataFrame
combined_df_copy = combined_df_copy.dropna()
# Create a heatmap of correlations for numerical columns
sns.heatmap(combined_df_copy.select_dtypes(include=np.number).corr(), vmin=0, vmax=1, cmap=plt.cm.Blues, annot=True)
# Display the heatmap
plt.show()
The heatmap above highlights CryoSleep as having the highest correlation with the Transported feature, confirming the observations discussed in Section 2.1. Note that generating a seaborn correlation heatmap requires numerical values; hence, temporary feature engineering (mapping booleans to 0/1) was applied to facilitate this analysis.
sns.pairplot(combined_df_copy, hue = 'Transported');
# Numerical features
num_feats=['Age', 'RoomService', 'FoodCourt', 'ShoppingMall', 'Spa', 'VRDeck']
# Plot numerical features
fig = plt.figure(figsize=(10, 20))
for i, var_name in enumerate(num_feats):
    # Left plot: full distribution
    ax = fig.add_subplot(6, 2, 2*i + 1)
    sns.histplot(data=combined_df, x=var_name, ax=ax, bins=30, kde=False, hue='Transported')
    if i != 0:
        ax.set_title(var_name + " Expenditure")
    # Right plot: truncated y-axis to expose the tail of the distribution
    ax = fig.add_subplot(6, 2, 2*i + 2)
    sns.histplot(data=combined_df, x=var_name, ax=ax, bins=30, kde=True, hue='Transported')
    if i != 0:
        plt.ylim([0, 100])
        ax.set_title(var_name + " Expenditure")
fig.tight_layout()  # Improves appearance a bit
plt.show()
Most passengers did not spend money, and those spending less were more likely to be transported. This observation strengthens the relationship between CryoSleep and Transported, as individuals in CryoSleep could not spend. To simplify the analysis, all individual expense features are also aggregated into a single TotalExpenditure feature.
expense_feature = ['RoomService','FoodCourt','Spa','VRDeck','ShoppingMall']
combined_df['TotalExpenditure'] = combined_df.loc[:,expense_feature].sum(axis=1)
temp_combined_df['TotalExpenditure'] = temp_combined_df.loc[:,expense_feature].sum(axis=1)
combined_df.head(3)
| | PassengerId | HomePlanet | CryoSleep | Cabin | Destination | Age | VIP | RoomService | FoodCourt | ShoppingMall | Spa | VRDeck | Name | Transported | TotalExpenditure |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0001_01 | Europa | False | B/0/P | TRAPPIST-1e | 39.0 | False | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | Maham Ofracculy | False | 0.0 |
| 1 | 0002_01 | Earth | False | F/0/S | TRAPPIST-1e | 24.0 | False | 109.0 | 9.0 | 25.0 | 549.0 | 44.0 | Juanna Vines | True | 736.0 |
| 2 | 0003_01 | Europa | False | A/0/S | TRAPPIST-1e | 58.0 | True | 43.0 | 3576.0 | 0.0 | 6715.0 | 49.0 | Altark Susent | False | 10383.0 |
combined_df.groupby('HomePlanet')['VIP'].mean()
HomePlanet
Earth          0.0
Europa    0.057843
Mars      0.034312
Name: VIP, dtype: object
combined_df.groupby('HomePlanet')['Transported'].mean()
HomePlanet
Earth     0.423946
Europa    0.658846
Mars      0.523024
Name: Transported, dtype: object
combined_df.groupby('Destination')['Transported'].mean()
Destination
55 Cancri e          0.61
PSO J318.5-22    0.503769
TRAPPIST-1e      0.471175
Name: Transported, dtype: object
combined_df.groupby('CryoSleep')['Transported'].mean()
CryoSleep
False    0.328921
True     0.817583
Name: Transported, dtype: object
combined_df.groupby('VIP')['TotalExpenditure'].mean()
VIP
False    1358.276510
True     4595.542125
Name: TotalExpenditure, dtype: float64
# Display box-and-whisker diagrams for each numerical feature
num_variables = len(num_feats)
nrows = num_variables // 2
ncols = 2
fig, axes = plt.subplots(nrows=nrows, ncols=ncols, figsize=(25, 25))
for i, var in enumerate(num_feats):
    row = i // ncols
    col = i % ncols
    ax = axes[row, col]
    combined_df[var].plot(kind='box', ax=ax)
    ax.set_title(var)
plt.tight_layout()
plt.show()
Outliers are noted above, particularly in the expenditure-related features. The groupby aggregations also reveal several relationships: there are no VIPs from Earth, passengers in CryoSleep have a high chance of being transported, and VIP passengers tend to have higher expenditure.
3. Feature Engineering¶
Within this section, various techniques are employed to preprocess and engineer features. Beginning with Section 3.1, features are adjusted to ensure alignment with analysis requirements. Subsequently, in Section 3.2, missing data entries are addressed through appropriate imputation methods. Section 3.3 focuses on scaling numerical data to ensure uniformity in magnitudes. Categorical variables are then encoded using one-hot encoding techniques, detailed in Section 3.4. Finally, the dataset is split into training and test sets in Section 3.5 to facilitate effective model training and evaluation. Additionally, strategies for feature selection and multicollinearity mitigation are explored to optimize model performance.
3.1 Adjusting Features¶
3.1.1 PassengerId and Cabin¶
The PassengerId and Cabin features contain multiple pieces of information that can be extracted into additional features. The Cabin value takes the format deck/num/side; hence, it is decomposed into three separate features. Additionally, the PassengerId adopts the structure gggg_pp, where gggg denotes the group number and pp represents a passenger's number within that group. By parsing the group numbers from the PassengerId, an additional feature capturing group size can be derived.
combined_df['Group'] = combined_df['PassengerId'].astype(str).str[:4]
counts = combined_df['Group'].value_counts()
combined_df['Group'] = combined_df['Group'].map(counts)
combined_df[['Deck', 'Number', 'Side']] = combined_df['Cabin'].str.split('/', expand=True)
combined_df.head(5)
| PassengerId | HomePlanet | CryoSleep | Cabin | Destination | Age | VIP | RoomService | FoodCourt | ShoppingMall | Spa | VRDeck | Name | Transported | TotalExpenditure | Group | Deck | Number | Side | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0001_01 | Europa | False | B/0/P | TRAPPIST-1e | 39.0 | False | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | Maham Ofracculy | False | 0.0 | 1 | B | 0 | P |
| 1 | 0002_01 | Earth | False | F/0/S | TRAPPIST-1e | 24.0 | False | 109.0 | 9.0 | 25.0 | 549.0 | 44.0 | Juanna Vines | True | 736.0 | 1 | F | 0 | S |
| 2 | 0003_01 | Europa | False | A/0/S | TRAPPIST-1e | 58.0 | True | 43.0 | 3576.0 | 0.0 | 6715.0 | 49.0 | Altark Susent | False | 10383.0 | 2 | A | 0 | S |
| 3 | 0003_02 | Europa | False | A/0/S | TRAPPIST-1e | 33.0 | False | 0.0 | 1283.0 | 371.0 | 3329.0 | 193.0 | Solam Susent | False | 5176.0 | 2 | A | 0 | S |
| 4 | 0004_01 | Earth | False | F/1/S | TRAPPIST-1e | 16.0 | False | 303.0 | 70.0 | 151.0 | 565.0 | 2.0 | Willy Santantines | True | 1091.0 | 1 | F | 1 | S |
3.1.2 Age¶
Based on the data analysis conducted in Section 2, it is evident that the age range of passengers spans 79 years. This broad spectrum of values in its current numerical state lacks substantial interpretive value. However, this limitation can be mitigated by categorizing the data into age groups, thereby facilitating the identification of trends among different age cohorts. Additionally, the incorporation of an age group feature is anticipated to significantly alleviate model complexity.
By aggregating passengers into distinct age brackets, it becomes feasible to explore the relationship between age groups and pertinent variables such as Transported status and TotalExpenditure. This segmentation approach allows for a more nuanced analysis of how age influences both the likelihood of transportation and expenditure patterns.
sns.histplot(data=combined_df, x='Age', hue='Transported', element='step', kde=True,binwidth=5)
plt.title('Histogram of Age by Transported')
plt.xlabel('Age')
plt.ylabel('Count')
plt.show()
combined_df['AgeGroup']=np.nan
combined_df.loc[combined_df['Age']<=12,'AgeGroup']='1'
combined_df.loc[(combined_df['Age']>12) & (combined_df['Age']<18),'AgeGroup']='2'
combined_df.loc[(combined_df['Age']>=18) & (combined_df['Age']<=25),'AgeGroup']='3'
combined_df.loc[(combined_df['Age']>25) & (combined_df['Age']<=30),'AgeGroup']='4'
combined_df.loc[(combined_df['Age']>30) & (combined_df['Age']<=50),'AgeGroup']='5'
combined_df.loc[combined_df['Age']>50,'AgeGroup']='6'
combined_df.groupby('AgeGroup')['Transported'].mean()
AgeGroup
1    0.699752
2    0.553451
3    0.458103
4    0.496272
5    0.479432
6    0.484396
Name: Transported, dtype: object
combined_df.groupby('AgeGroup')['TotalExpenditure'].mean().fillna(0)
AgeGroup
1       0.000000
2     864.158085
3    1109.285911
4    1939.196378
5    1946.321492
6    1941.712022
Name: TotalExpenditure, dtype: float64
From the above relationships, two observations emerge:
- Individuals belonging to the initial AgeGroup (0-12 years old) exhibit no expenditure.
- Individuals within the first AgeGroup (0-12 years old) demonstrate the highest likelihood of transportation.
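As a design note, the manual `.loc`-based binning above can be expressed more compactly with `pd.cut`. A minimal sketch on synthetic ages (not the competition data); since `pd.cut` uses right-inclusive intervals, the bin edges below reproduce the manual grouping for integer-valued ages:

```python
import numpy as np
import pandas as pd

ages = pd.Series([4, 15, 22, 28, 40, 65, np.nan])

# Edges mirror the manual assignments: <=12, 13-17, 18-25, 26-30, 31-50, >50
age_group = pd.cut(
    ages,
    bins=[-np.inf, 12, 17, 25, 30, 50, np.inf],
    labels=['1', '2', '3', '4', '5', '6'],
)
print(age_group.tolist())  # one label per age; NaN stays NaN
```

Missing ages fall outside every bin and remain NaN, matching the behaviour of initializing `AgeGroup` to `np.nan` before the `.loc` assignments.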
3.2 Filling Missing Data Entries¶
# Rearrange the order of the columns so that numerical columns are grouped together, categorical columns are grouped together,
# and the target feature is at the end
num_cols = ['ShoppingMall','FoodCourt','RoomService','Spa','VRDeck','TotalExpenditure','Age']
cat_cols = ['CryoSleep','Deck','Side','VIP','HomePlanet','Destination',"Group", "AgeGroup"]
transported=['Transported']
combined_df = combined_df[num_cols+cat_cols+transported].copy()
# Print the count of the total number of missing data entries excluding the target feature
print("Number of missing data entries:", (combined_df.isna().sum().sum() - combined_df["Transported"].isna().sum()))
Number of missing data entries: 3716
When addressing missing data entries, it is imperative to discern the underlying pattern of their absence, categorizing them as Missing Completely at Random (MCAR), Missing at Random (MAR), or Missing Not at Random (MNAR) [2]. Such classification is pivotal in determining suitable imputation methods for filling the voids. Upon scrutiny of the data, it becomes apparent that certain missing data may adhere to the MNAR pattern, as inferred from their associations, while the remaining missing values may adhere to the MAR pattern.
The observed relationships are as follows:
- Individuals in CryoSleep exhibit no expenditure.
- There are no VIP passengers hailing from Earth.
Leveraging these insights, missing expenditure features for passengers in CryoSleep can be substituted with zero values. Additionally, missing data entries are addressed where either the VIP status is confirmed as False but the HomePlanet is absent, or when the HomePlanet is identified as Earth but the VIP status is unknown. Remaining missing values within these categories are filled using the methods delineated below.
Simply removing rows or columns with missing entries is considered, albeit this approach is deemed unsuitable due to the substantial volume of missing data entries - 3716 in total. Such indiscriminate removal could significantly diminish the available data, thereby undermining the analysis. Consequently, this approach is deemed suboptimal for addressing missing data.
Alternatively, employing a Simple Imputer to replace missing numerical feature data with their mean and categorical feature data with their mode is proposed [2]. While this preserves the total number of entries, it may introduce slight inaccuracies and biases into the data. Notably, as observed from the data analysis in Section 2, numerous outliers are identified within the numerical features. These outliers can distort the data and potentially compromise the accuracy of imputed values. Consequently, this method is preferred for addressing missing entries within categorical features.
Furthermore, consideration is given to the K-Nearest-Neighbors imputation method, which imputes missing data entries with the mean value of the K nearest neighbors [3]. This method is anticipated to offer more accurate imputations compared to the Simple Imputer approach. As such, it is employed for filling missing entries within numerical features.
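The difference between the two imputers can be illustrated on a toy numerical frame (synthetic values, not the competition data): the SimpleImputer mean is pulled toward outliers, while the KNNImputer fills from the rows nearest in feature space.

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer, KNNImputer

# Toy frame with an outlier-heavy column 'b' and one missing value
toy = pd.DataFrame({
    'a': [1.0, 2.0, 3.0, 4.0, 5.0],
    'b': [100.0, 22.0, np.nan, 24.0, 100.0],
})

# SimpleImputer fills with the global column mean of 'b':
# (100 + 22 + 24 + 100) / 4 = 61.5, inflated by the two outliers
simple = SimpleImputer(strategy='mean').fit_transform(toy)

# KNNImputer averages 'b' over the two nearest rows (a=2 and a=4):
# (22 + 24) / 2 = 23.0
knn = KNNImputer(n_neighbors=2).fit_transform(toy)

print(simple[2, 1], knn[2, 1])  # 61.5 vs. 23.0
```

This mirrors the reasoning above: given the many outliers in the expenditure features, the neighbour-based estimate is generally less distorted than a global mean.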
combined_df.isna().sum()
ShoppingMall         306
FoodCourt            289
RoomService          263
Spa                  284
VRDeck               268
TotalExpenditure       0
Age                  270
CryoSleep            310
Deck                 299
Side                 299
VIP                  296
HomePlanet           288
Destination          274
Group                  0
AgeGroup             270
Transported         4277
dtype: int64
# Function to set expense features to 0 if CryoSleep is True
def adjust_expenses(row):
    if row['CryoSleep'] == True:
        row[expense_feature] = 0
        row['TotalExpenditure'] = 0
    return row

# Apply the function to the DataFrame
combined_df = combined_df.apply(adjust_expenses, axis=1)

# Fill missing 'CryoSleep' values: zero total expenditure suggests CryoSleep
combined_df['CryoSleep'] = combined_df.apply(
    lambda x: True if x.TotalExpenditure == 0 and pd.isna(x.CryoSleep) else x.CryoSleep, axis=1)

# Fill missing 'VIP' values: there are no VIP passengers from Earth
combined_df['VIP'] = combined_df.apply(
    lambda x: False if x.HomePlanet == "Earth" and pd.isna(x.VIP) else x.VIP, axis=1)

# Fill missing 'HomePlanet' values based on the 'VIP' column
combined_df['HomePlanet'] = combined_df.apply(
    lambda x: "Earth" if x.VIP == False and pd.isna(x.HomePlanet) else x.HomePlanet, axis=1)

# Function to set expense features to 0 for the youngest age group (0-12)
def adjust_child_expenses(row):
    if row['AgeGroup'] == '1':
        row[expense_feature] = 0
        row['TotalExpenditure'] = 0
    return row

# Apply the function to the DataFrame
combined_df = combined_df.apply(adjust_child_expenses, axis=1)
combined_df.isna().sum()
ShoppingMall         163
FoodCourt            171
RoomService          162
Spa                  166
VRDeck               150
TotalExpenditure       0
Age                  270
CryoSleep            174
Deck                 299
Side                 299
VIP                  137
HomePlanet             9
Destination          274
Group                  0
AgeGroup             270
Transported         4277
dtype: int64
The utilization of identified relationships between specific features allows for the filling of numerous missing data entries with expected values. This observation aligns with the assertion made earlier regarding certain missing values in the dataset adhering to the MNAR pattern.
num_imp = KNNImputer(n_neighbors=10)
cat_imp = SimpleImputer(strategy='most_frequent')
The KNNImputer was utilized over the commonly used SimpleImputer (mean strategy) for the numerical data, as it provided better results: the more sophisticated algorithm yields a closer approximation of the missing values.
combined_df[num_cols] = pd.DataFrame(num_imp.fit_transform(combined_df[num_cols]),columns=num_cols)
combined_df[cat_cols] = pd.DataFrame(cat_imp.fit_transform(combined_df[cat_cols]),columns=cat_cols)
# Rearrange the order of the columns so that numerical columns are grouped together, categorical columns are grouped together,
# and the target feature is at the end
num_cols = ['ShoppingMall','FoodCourt','RoomService','Spa','VRDeck','TotalExpenditure','Age']
cat_cols = ['CryoSleep','Deck','Side','VIP','HomePlanet','Destination',"Group","AgeGroup"]
transported=['Transported']
combined_df= combined_df[num_cols+cat_cols+transported].copy()
3.3 Scaling Numerical Data¶
Feature scaling is a crucial aspect of the feature engineering process, involving the standardization of numerical features within a dataset to a specific range. This procedure offers several advantages [4]:
- Prevents Feature Dominance: Features with larger ranges have a tendency to dominate the model, resulting in biased results. Feature scaling prevents this from occurring.
- Improves Algorithm Performance: Many algorithms converge faster when features are scaled. This results in better performance.
- Enhances Numerical Stability: Reduces the range of varying feature scales, thereby reducing the risk of numerical problems.
One common method of scaling features is through the use of StandardScaler. This technique adjusts the mean and standard deviation of each feature to zero and one, respectively, thereby standardizing the distribution of data. Importantly, StandardScaler preserves the overall shape of the data's distribution, ensuring that feature dominance is mitigated during the training process [5].
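The transformation is simply z = (x - μ) / σ per column. A small sketch on synthetic values (not the competition data) confirms that StandardScaler matches the manual computation and yields zero mean and unit standard deviation:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Synthetic single-feature column
x = np.array([[1.0], [2.0], [3.0], [4.0], [10.0]])

scaled = StandardScaler().fit_transform(x)

# Equivalent manual computation (population std, ddof=0, as StandardScaler uses)
manual = (x - x.mean()) / x.std()

print(np.allclose(scaled, manual))  # True
```

Note that the relative ordering and shape of the data are preserved; only the location and scale change.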
combined_df[num_cols] = StandardScaler().fit_transform(combined_df[num_cols])
combined_df
| | ShoppingMall | FoodCourt | RoomService | Spa | VRDeck | TotalExpenditure | Age | CryoSleep | Deck | Side | VIP | HomePlanet | Destination | Group | AgeGroup | Transported |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | -0.294298 | -0.282930 | -0.342323 | -0.271011 | -0.257880 | -0.510541 | 0.709968 | False | B | P | False | Europa | TRAPPIST-1e | 1 | 5 | False |
| 1 | -0.251502 | -0.277190 | -0.172472 | 0.219782 | -0.220225 | -0.248363 | -0.341680 | False | F | S | False | Earth | TRAPPIST-1e | 1 | 3 | True |
| 2 | -0.294298 | 1.997785 | -0.275318 | 5.732045 | -0.215946 | 3.188082 | 2.042056 | False | A | S | True | Europa | TRAPPIST-1e | 2 | 6 | False |
| 3 | 0.340791 | 0.535347 | -0.342323 | 2.705038 | -0.092712 | 1.333249 | 0.289309 | False | A | S | False | Europa | TRAPPIST-1e | 2 | 5 | False |
| 4 | -0.035811 | -0.238285 | 0.129832 | 0.234086 | -0.256169 | -0.121906 | -0.902559 | False | F | S | False | Earth | TRAPPIST-1e | 1 | 2 | True |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 12965 | -0.294298 | -0.282930 | -0.342323 | -0.271011 | -0.257880 | -0.510541 | 0.359419 | True | G | S | False | Earth | TRAPPIST-1e | 2 | 5 | NaN |
| 12966 | -0.265196 | 0.257273 | -0.342323 | -0.262072 | -0.134646 | -0.147910 | 0.920298 | False | F | S | False | Earth | TRAPPIST-1e | 1 | 5 | NaN |
| 12967 | -0.294298 | -0.282930 | -0.342323 | -0.271011 | -0.257880 | -0.510541 | 0.436540 | True | D | P | False | Mars | 55 Cancri e | 1 | 5 | NaN |
| 12968 | -0.294298 | 1.426330 | -0.342323 | -0.271011 | 0.189699 | 0.630429 | 0.534693 | False | D | P | False | Europa | TRAPPIST-1e | 1 | 5 | NaN |
| 12969 | -0.294298 | -0.282930 | -0.342323 | -0.271011 | -0.257880 | -0.510541 | 0.990408 | True | G | S | False | Earth | PSO J318.5-22 | 1 | 5 | NaN |
12970 rows × 16 columns
3.4 One-Hot Encoding¶
Machine learning algorithms are inherently designed to operate on numerical data [6]. However, categorical features often contain valuable information that needs to be transformed into a numerical format for utilization by these algorithms. One commonly employed technique for this purpose is One Hot Encoding (OHE).
OHE involves analyzing a categorical feature to identify the number of unique variables or types within it, denoted as n. Subsequently, n new binary features are created to represent each unique category, effectively encapsulating the categorical data in a numerical form. This approach effectively overcomes the inherent limitations of machine learning algorithms, which typically require numerical inputs for operation.
ohe = OneHotEncoder(handle_unknown='ignore', sparse_output=False)
temp_train = pd.DataFrame(ohe.fit_transform(combined_df[cat_cols]), columns=ohe.get_feature_names_out())
combined_df = combined_df.drop(cat_cols, axis=1)
combined_df = pd.concat([combined_df, temp_train], axis=1)
3.5 Separating Train and Test Data¶
With the adjustment of feature columns, filling of missing data, and encoding of categorical features completed, the train and test data are now segregated back into their original forms.
train_df = combined_df[combined_df['Transported'].notnull()].copy()
train_df['Transported'] = train_df['Transported'].astype('int')
test_df = combined_df[combined_df['Transported'].isnull()].drop("Transported", axis=1)
X = train_df.drop('Transported',axis=1)
y = train_df["Transported"]
print(X)
print(y)
ShoppingMall FoodCourt RoomService Spa VRDeck \
0 -0.294298 -0.282930 -0.342323 -0.271011 -0.257880
1 -0.251502 -0.277190 -0.172472 0.219782 -0.220225
2 -0.294298 1.997785 -0.275318 5.732045 -0.215946
3 0.340791 0.535347 -0.342323 2.705038 -0.092712
4 -0.035811 -0.238285 0.129832 0.234086 -0.256169
... ... ... ... ... ...
8688 -0.294298 4.066117 -0.342323 1.197793 -0.194552
8689 -0.294298 -0.282930 -0.342323 -0.271011 -0.257880
8690 2.910248 -0.282930 -0.342323 -0.270118 -0.257880
8691 -0.294298 0.386105 -0.342323 0.044562 2.510606
8692 -0.294298 2.707000 -0.145981 -0.271011 -0.247611
TotalExpenditure Age CryoSleep_False CryoSleep_True Deck_A \
0 -0.510541 0.709968 1.0 0.0 0.0
1 -0.248363 -0.341680 1.0 0.0 0.0
2 3.188082 2.042056 1.0 0.0 1.0
3 1.333249 0.289309 1.0 0.0 1.0
4 -0.121906 -0.902559 1.0 0.0 0.0
... ... ... ... ... ...
8688 2.530145 0.850188 1.0 0.0 1.0
8689 -0.510541 -0.762339 0.0 1.0 0.0
8690 0.156658 -0.201460 1.0 0.0 0.0
8691 1.141247 0.219199 1.0 0.0 0.0
8692 1.208572 1.060518 1.0 0.0 0.0
... Group_5 Group_6 Group_7 Group_8 AgeGroup_1 AgeGroup_2 \
0 ... 0.0 0.0 0.0 0.0 0.0 0.0
1 ... 0.0 0.0 0.0 0.0 0.0 0.0
2 ... 0.0 0.0 0.0 0.0 0.0 0.0
3 ... 0.0 0.0 0.0 0.0 0.0 0.0
4 ... 0.0 0.0 0.0 0.0 0.0 1.0
... ... ... ... ... ... ... ...
8688 ... 0.0 0.0 0.0 0.0 0.0 0.0
8689 ... 0.0 0.0 0.0 0.0 0.0 0.0
8690 ... 0.0 0.0 0.0 0.0 0.0 0.0
8691 ... 0.0 0.0 0.0 0.0 0.0 0.0
8692 ... 0.0 0.0 0.0 0.0 0.0 0.0
AgeGroup_3 AgeGroup_4 AgeGroup_5 AgeGroup_6
0 0.0 0.0 1.0 0.0
1 1.0 0.0 0.0 0.0
2 0.0 0.0 0.0 1.0
3 0.0 0.0 1.0 0.0
4 0.0 0.0 0.0 0.0
... ... ... ... ...
8688 0.0 0.0 1.0 0.0
8689 1.0 0.0 0.0 0.0
8690 0.0 1.0 0.0 0.0
8691 0.0 0.0 1.0 0.0
8692 0.0 0.0 1.0 0.0
[8693 rows x 41 columns]
0 0
1 1
2 0
3 0
4 1
..
8688 0
8689 0
8690 1
8691 0
8692 1
Name: Transported, Length: 8693, dtype: int32
3.6 Identifying Correlated Features¶
Identifying the correlation between variables is a crucial step in feature selection, as highly correlated features can adversely affect model performance. High correlations between features can lead to several issues [6]:
- Redundancy of Information: Certain features provide no new information to a model, as the same information can be inferred from another feature, so they do not improve the model's predictive power. Removing redundant features can assist in preventing overfitting, resulting in better performance on unseen data.
- Increased Complexity of Models: If highly correlated features are not removed, the dimensionality of the data is increased, which can result in longer computation times.
- Interpretation of Data: Highly correlated features can complicate the interpretation of feature importance, obscuring a model’s decision making.
Hence, it is essential to identify and remove correlated features from the dataset. This is achieved by identifying the top ten absolute correlations between different features, excluding correlations between a feature and itself, as this always results in a maximum correlation of 1.0.
def get_redundant_pairs(df):
    # Drop pairs on or below the diagonal of the correlation matrix,
    # i.e. self-correlations and duplicate (a, b)/(b, a) pairs
    pairs_to_drop = set()
    cols = df.columns
    for i in range(0, df.shape[1]):
        for j in range(0, i + 1):
            pairs_to_drop.add((cols[i], cols[j]))
    return pairs_to_drop

def get_top_abs_correlations(df, n=1):
    au_corr = df.corr().abs().unstack()
    labels_to_drop = get_redundant_pairs(df)
    au_corr = au_corr.drop(labels=labels_to_drop).sort_values(ascending=False)
    return au_corr[0:n]

print("Top Absolute Correlations !")
print(get_top_abs_correlations(X, 10))
Top Absolute Correlations !
CryoSleep_False          CryoSleep_True             1.000000
VIP_False                VIP_True                   1.000000
Side_P                   Side_S                     1.000000
Destination_55 Cancri e  Destination_TRAPPIST-1e    0.783137
FoodCourt                TotalExpenditure           0.743796
HomePlanet_Earth         HomePlanet_Europa          0.633221
Age                      AgeGroup_6                 0.621302
Spa                      TotalExpenditure           0.594728
VRDeck                   TotalExpenditure           0.586090
Deck_G                   HomePlanet_Earth           0.581275
dtype: float64
Among the top ten most correlated feature pairs, three exhibit a maximum correlation of 1.0: CryoSleep_False and CryoSleep_True, VIP_False and VIP_True, and Side_P and Side_S. Each pair carries duplicate information, so one feature from each pair is slated for removal from the dataset.
Furthermore, additional features such as Destination_TRAPPIST-1e, FoodCourt, Age, and HomePlanet_Earth are also marked for elimination due to their high correlations with other features. This step streamlines the dataset and enhances model efficiency by reducing unnecessary redundancy and complexity.
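As a toy illustration of why such pairs are perfectly correlated (hypothetical four-row data, not the competition set), two complementary one-hot columns always carry identical information, so their absolute Pearson correlation is 1.0:

```python
import pandas as pd

# Hypothetical four-passenger sample, not the competition data
toy = pd.DataFrame({"CryoSleep": [True, False, False, True]})

# One-hot encode: the two resulting columns are exact complements
dummies = pd.get_dummies(toy["CryoSleep"], prefix="CryoSleep").astype(float)
corr = dummies["CryoSleep_False"].corr(dummies["CryoSleep_True"])
print(abs(corr))  # absolute correlation is (numerically) 1.0, so one column is redundant
```

Passing `drop_first=True` to `pd.get_dummies` would avoid creating the redundant column of each pair in the first place.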
The idea and code were inspired by the XGBClassifier+Optuna notebook [8].
drop_list = ["Age", 'CryoSleep_True', 'HomePlanet_Earth', 'VIP_False', 'FoodCourt', 'Destination_TRAPPIST-1e', "Side_P"]
X = X.drop(drop_list, axis=1)
test_df = test_df.drop(drop_list, axis=1)
X.columns
Index(['ShoppingMall', 'RoomService', 'Spa', 'VRDeck', 'TotalExpenditure',
'CryoSleep_False', 'Deck_A', 'Deck_B', 'Deck_C', 'Deck_D', 'Deck_E',
'Deck_F', 'Deck_G', 'Deck_T', 'Side_S', 'VIP_True', 'HomePlanet_Europa',
'HomePlanet_Mars', 'Destination_55 Cancri e',
'Destination_PSO J318.5-22', 'Group_1', 'Group_2', 'Group_3', 'Group_4',
'Group_5', 'Group_6', 'Group_7', 'Group_8', 'AgeGroup_1', 'AgeGroup_2',
'AgeGroup_3', 'AgeGroup_4', 'AgeGroup_5', 'AgeGroup_6'],
dtype='object')
4 Models¶
Cross-validation is a valuable technique used to assess model performance on unseen data by resampling data [9]. It helps mitigate overfitting, a common issue where a model performs well on training data but less accurately on unseen data. Various methods of cross-validation can be employed:
- Hold Out Cross Validation: This method randomly splits a dataset into separate training and validation subsets. For instance, 70% of the data can be allocated to training, with the remaining 30% used for validation.
- Stratified K-Fold Cross Validation: In this approach, the dataset is divided into K subsets of equal size while maintaining the class ratio and distribution of the entire dataset. Each subset, or fold, is used as a validation set in turn, while the remaining data serves as the training set. The process is repeated until each fold has been used for validation, and the average validation score is computed.
- Leave-One-Out Cross Validation: Similar to the generalized K-Fold method, but here K equals the total number of entries in the dataset. Each data entry is used as a test sample while the remaining data is used for training. However, this approach is computationally expensive, especially for large datasets.
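The class-ratio property of Stratified K-Fold can be checked on a small synthetic example (an 80/20 class split chosen purely for illustration, not the competition data):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Toy labels with an 80/20 class split
y_toy = np.array([0] * 80 + [1] * 20)
X_toy = np.arange(len(y_toy)).reshape(-1, 1)

# Every validation fold preserves the 20% positive-class ratio
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for fold, (_, val_idx) in enumerate(skf.split(X_toy, y_toy)):
    print(f"fold {fold}: positive ratio = {y_toy[val_idx].mean():.2f}")  # 0.20 in every fold
```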
The choice of cross-validation method depends on the specific requirements and characteristics of the dataset. To determine the most suitable method, an estimation of accuracy scores for different models is conducted below.
random_state = 2
classifiers = [SVC(), RandomForestClassifier(), ExtraTreesClassifier(),
GradientBoostingClassifier(), CatBoostClassifier(verbose = False),
XGBClassifier(), LGBMClassifier()]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=random_state)
results = {}
for name, classifier in zip(["SVC", "RandomForest", "ExtraTrees", "GradientBoosting", "CatBoostClassifier", "XGBClassifier", "LGBMClassifier"], classifiers):
    classifier.fit(X_train, y_train)
    predictions = classifier.predict(X_test)
    accuracy = accuracy_score(y_test, predictions)
    results[name] = accuracy
cv_res_holdout = pd.DataFrame(results, index=['Accuracy'])
[LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 2167, number of negative: 2179 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000276 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 4346, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.498619 -> initscore=-0.005522 [LightGBM] [Info] Start training from score -0.005522
cv_res_holdout
| SVC | RandomForest | ExtraTrees | GradientBoosting | CatBoostClassifier | XGBClassifier | LGBMClassifier | |
|---|---|---|---|---|---|---|---|
| Accuracy | 0.795951 | 0.785369 | 0.767426 | 0.795951 | 0.803773 | 0.795721 | 0.804463 |
4.1.2 Stratified K-Fold Cross Validation¶
kfold = StratifiedKFold(n_splits=20, shuffle = True)
# Modeling step: test different algorithms
random_state = 2
classifiers = [SVC(),RandomForestClassifier(), ExtraTreesClassifier(),GradientBoostingClassifier(),
CatBoostClassifier(verbose = False),
XGBClassifier(), LGBMClassifier()]
cross_validation_results = []
for classifier in classifiers:
    cross_validation_results.append(cross_val_score(classifier, X, y=y, scoring="accuracy", cv=kfold, n_jobs=4))
cross_validation_mean = []
cross_validation_std = []
for cv_result in cross_validation_results:
    cross_validation_mean.append(cv_result.mean())
    cross_validation_std.append(cv_result.std())
cross_validation_res = pd.DataFrame({"CrossValMeans":cross_validation_mean,"CrossValerrors": cross_validation_std,"Algorithm":["SVC", "RandomForest","ExtraTrees","GradientBoosting", "CatBoostClassifier", "XGBClassifier", "LGBMClassifier"]})
cross_validation_res
| CrossValMeans | CrossValerrors | Algorithm | |
|---|---|---|---|
| 0 | 0.794547 | 0.022424 | SVC |
| 1 | 0.790407 | 0.015215 | RandomForest |
| 2 | 0.774647 | 0.013541 | ExtraTrees |
| 3 | 0.797649 | 0.015764 | GradientBoosting |
| 4 | 0.812148 | 0.015812 | CatBoostClassifier |
| 5 | 0.800988 | 0.025801 | XGBClassifier |
| 6 | 0.805588 | 0.020863 | LGBMClassifier |
Upon evaluating the estimated cross-validation scores, it is apparent that the Stratified K-Fold method performs better than the Hold Out method.
Having identified Stratified K-Fold as the preferred cross-validation method, different fold sizes (5, 10, 15, 20) were tested, and a fold size of 20 yielded the best results.
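A minimal sketch of such a fold-size comparison, using a single classifier on synthetic data rather than the full classifier list and dataset above:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic stand-in for X and y, purely for illustration
X_toy, y_toy = make_classification(n_samples=500, n_features=20, random_state=2)

# Compare mean cross-validated accuracy across candidate fold counts
for n_splits in (5, 10, 15, 20):
    cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=2)
    scores = cross_val_score(GradientBoostingClassifier(), X_toy, y_toy,
                             cv=cv, scoring="accuracy")
    print(f"{n_splits:>2} folds: mean accuracy = {scores.mean():.4f}")
```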
cv = StratifiedKFold(n_splits=20, shuffle = True)
4.2 Feature Selection using LGBM Classifier¶
Classifiers in general are algorithms that organise data as belonging to certain groups or "classes", essentially predicting the class(es) that a datapoint belongs to [12][13]. The Light Gradient Boosting Machine (LGBM) classifier is a machine learning algorithm that makes use of decision trees for ranking and classification of data, among other tasks [10]. The classifier employs techniques such as Gradient-Based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB), among other methods, to achieve its ranking [10][11]. GOSS improves training time and optimizes memory usage by retaining instances with large gradients during training [10][11]. EFB speeds up training by bundling mutually exclusive features, reducing the dimensionality of the data [10].
The LGBM Classifier is used here to determine the 15 most important features of the dataset. This allows for the ranking of features and the removal of unimportant ones, leading to improved performance and an easier understanding of the underlying processes upon which the model operates [copilot]. Using the LGBM classifier has various advantages:
- Improved accuracy (compared to other boosting algorithms) and good handling of overfitting in small datasets [14].
- Increased training speeds [15].
- Decreased memory usage [14].
To proceed, the training data is split into separate training and validation sets. The training set is utilized to train the model and determine optimal parameters, while the validation set assesses the model's performance on unseen data, simulating real-world scenarios.
The top 15 features identified will be employed for model training.
%%capture
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)
# Train the LightGBM model on the training split only, so the validation
# set remains unseen when importance is measured
model = LGBMClassifier()
model.fit(X_train, y_train)
# Get permutation importance
result = permutation_importance(model, X_val, y_val, scoring="accuracy", n_repeats=100, random_state=42)
sorted_indices = np.argsort(result.importances_mean)[::-1]
# Show the most important features and their respective importance values
top_features = X.columns[sorted_indices[:15]]  # 15 most important features
top_importances = result.importances_mean[sorted_indices[:15]]
for feature, importance in zip(top_features, top_importances):
    print(f"{feature}: {importance}")
top_features
Index(['VRDeck', 'Spa', 'RoomService', 'TotalExpenditure', 'CryoSleep_False',
'Side_S', 'Deck_E', 'ShoppingMall', 'Deck_F', 'Deck_C',
'HomePlanet_Mars', 'Deck_B', 'Destination_55 Cancri e', 'AgeGroup_3',
'AgeGroup_2'],
dtype='object')
# Let's move on with the top 15 features
X_FeatureSelection_LGM = X[top_features]
test_FeatureSelection_LGM = test_df[top_features]
4.3 Feature Selection using RFE¶
%%capture
# Create the RFE object and rank each feature
# Initialize LightGBM classifier
clf = LGBMClassifier()
# Initialize RFE with LightGBM as the estimator
rfe = RFE(estimator=clf, n_features_to_select=15)
rfe.fit(X, y)
column_names = X.columns
selected_column_names = column_names[rfe.support_]
selected_column_names
Recursive Feature Elimination (RFE) represents a feature selection algorithm aimed at identifying the most significant features contributing to the predictive variable or output of a model [16][17][18]. While sharing a similar objective with the LGBM Classifier, RFE employs a distinct methodology. RFE iteratively eliminates features from the dataset, utilizing the remaining features to construct a model whose performance is then assessed [16][17]. This iterative process continues, generating progressively smaller feature sets until an optimal feature selection is achieved [16][17]. Similar to LGBM, RFE offers several advantages:
- Enhanced efficiency achieved through reduced complexity by discarding less important features [16].
- Improved accuracy attained by focusing on the most influential features [17].
- Mitigation of overfitting by eliminating less relevant features [16].
In this particular implementation, the top 15 features yielding the best model performance were retained.
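As an aside, scikit-learn also provides RFECV, which chooses the number of features automatically via cross-validation instead of fixing it at 15. A minimal sketch on synthetic data (a logistic-regression estimator is assumed here purely for speed, not the notebook's LGBM setup):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

# Synthetic stand-in for the real feature matrix, for illustration only
X_toy, y_toy = make_classification(n_samples=300, n_features=20,
                                   n_informative=5, random_state=0)

# RFECV eliminates one feature per step and keeps the count that
# maximizes cross-validated accuracy
selector = RFECV(estimator=LogisticRegression(max_iter=1000), step=1,
                 cv=StratifiedKFold(n_splits=5), scoring="accuracy")
selector.fit(X_toy, y_toy)
print("optimal number of features:", selector.n_features_)
```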
4.4 Justification for Selecting 15 Features¶
num_col = X_train.columns.shape[0]
feature_accuracy_scores = []
for i in range(1, num_col + 1):
    top_features = X.columns[sorted_indices[:i]]  # top i most important features
    model = LGBMClassifier()
    scores = cross_val_score(model, X[top_features], y, cv=StratifiedKFold(n_splits=10, shuffle=True), scoring='accuracy')
    feature_accuracy_scores.append(scores.mean())
#Plotting the graph
plt.plot(range(1, num_col+1), feature_accuracy_scores, marker='o')
plt.title('Feature Accuracy Scores vs Number of features for LGBM Classifier')
plt.xlabel('Number of Features')
plt.ylabel('Accuracy Score')
plt.grid(True)
plt.show()
[LightGBM] [Info] Total Bins 1022 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 5 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000155 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1022 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 5 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000073 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1022 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 5 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000139 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1022 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 5 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000177 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000186 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000168 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000169 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000073 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000205 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000114 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000191 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000191 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000143 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1024 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 6 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000165 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000305 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000077 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000162 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000105 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000185 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000089 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000179 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000208 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000113 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000131 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000194 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000199 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000211 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000196 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000285 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000190 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000099 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000185 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000190 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000209 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000365 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000196 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000189 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000187 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000178 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000196 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000192 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000197 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000190 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000127 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000129 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000210 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000186 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000187 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000108 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000224 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000201 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3884
[LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000132 seconds.
You can set `force_row_wise=true` to remove the overhead.
And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 1285
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315
[LightGBM] [Info] Start training from score 0.014315
[LightGBM] [Warning] Found whitespace in feature_names, replace with underlines
... (repeated output truncated: the same fold-by-fold LightGBM messages recur for every cross-validation split as the number of used features grows from 10 to 17 and Total Bins rises from 1285 to 1299, with the train set alternating between 7823 and 7824 rows)
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000601 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000365 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000321 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000339 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000308 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000607 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000320 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000621 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000531 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000530 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000683 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000446 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000487 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000436 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000378 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000318 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000580 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000588 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000643 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000377 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000362 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000421 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000513 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000330 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000923 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000610 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000400 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000570 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000668 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000335 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000729 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000504 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000608 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000807 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000345 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000358 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000353 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000645 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000310 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000355 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM training log, truncated: repeated fold-by-fold output from the cross-validated feature-selection loop. Each fold reports ~7,823–7,824 training rows with a near-balanced target (≈3,940 positive vs ≈3,883 negative, initscore ≈ 0.014), with the used-feature count growing from 21 to 28 as features are added. Two messages recur throughout: a warning that whitespace in feature names is replaced with underscores, and an info note that `force_col_wise=true` or `force_row_wise=true` can be set to skip the multi-threading auto-detection overhead.]
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000318 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000665 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000391 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000802 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000736 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000391 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000471 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000361 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000350 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000493 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000624 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000707 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000674 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000777 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000330 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000360 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000675 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000645 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000364 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000338 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000655 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000659 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000643 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000724 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000707 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000359 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000343 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000358 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000340 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000351 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000365 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000648 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000688 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000421 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000349 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000707 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Warning] Found whitespace in feature_names, replace with underlines [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000629 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3884
[LightGBM] [Info] Total Bins 1329
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315
[LightGBM] [Info] Start training from score 0.014315
[LightGBM] [Warning] Found whitespace in feature_names, replace with underlines
... (near-identical LightGBM training output for the remaining cross-validation folds truncated)
num_col = X_train.columns.shape[0]
feature_accuracy_scores = []
for i in range(2, num_col + 1):  # start at 2 because RFE needs at least 2 features
    top_features = X.columns[sorted_indices[:i]]  # i most important features by permutation importance
    # Initialize the LightGBM classifier (verbose=-1 suppresses the per-fold training logs)
    clf = LGBMClassifier(verbose=-1)
    # Recursive feature elimination with LightGBM as the estimator,
    # retaining at most 15 of the top-i candidate features
    rfe = RFE(estimator=clf, n_features_to_select=15)
    scores = cross_val_score(rfe, X[top_features], y,
                             cv=StratifiedKFold(n_splits=10, shuffle=True),
                             scoring='accuracy')
    feature_accuracy_scores.append(scores.mean())

# Plot accuracy against the number of candidate features
plt.plot(range(2, num_col + 1), feature_accuracy_scores, marker='o')
plt.title('Accuracy Score vs Number of Features for RFE')
plt.xlabel('Number of Features')
plt.ylabel('Accuracy Score')
plt.grid(True)
plt.show()
[LightGBM] [Info] Number of positive: 3940, number of negative: 3883
[LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000069 seconds. You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 510
[LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 2
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573
[LightGBM] [Info] Start training from score 0.014573
... (near-identical LightGBM training output for the remaining folds and feature counts truncated)
[LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000150 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000089 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000166 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000278 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000165 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000178 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000165 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000085 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000177 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1026 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 7 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000188 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000316 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000180 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000214 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000187 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000192 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000187 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000199 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000181 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000181 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1281 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 8 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000102 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000197 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000221 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000207 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000261 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000182 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000199 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000173 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000216 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000180 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1283 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 9 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000178 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000210 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000162 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000199 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000139 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000193 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000179 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000119 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000209 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1285 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 10 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000187 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000254 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000202 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000127 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000205 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000200 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000201 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000208 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000209 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000263 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1287 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 11 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000240 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1289 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 12 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000208 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1289 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 12 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000213 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1289 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 12 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000249 seconds. 
[LightGBM verbose training output truncated. Each cross-validation fold printed the same `[LightGBM] [Info]` block: roughly 7,823–7,824 training rows per fold (≈3,940 positive / ≈3,883 negative), a binary objective starting from an initial score of ≈0.014, and an automatic choice between row-wise and column-wise multi-threading, with the number of used features increasing from 12 to 17 across successive runs.]
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000595 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000337 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000467 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000318 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000678 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000531 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000622 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000537 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000401 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000616 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000306 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000340 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000353 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000334 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000536 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000526 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000355 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000904 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000673 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000586 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000436 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000639 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000547 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000441 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000614 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000651 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000609 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000379 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000612 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000582 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000659 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000418 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000407 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000571 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000355 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000454 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000564 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000369 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000555 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000399 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000504 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000458 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000428 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000624 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000352 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000465 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3884
[LightGBM] [Info] Total Bins 1295–1305; number of data points in the train set: 7823–7824, number of used features: 15–20
[LightGBM] [Info] [binary:BoostFromScore]: pavg ≈ 0.5036 -> initscore ≈ 0.0144; start training from score ≈ 0.0144
[LightGBM] [Info] Auto-choosing row-wise/col-wise multi-threading (set `force_row_wise=true` or `force_col_wise=true` to remove the overhead of testing)
(… identical [LightGBM] [Info] blocks, repeated for each cross-validation fold and feature subset, truncated …)
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000272 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000292 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000538 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000402 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000356 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000442 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000257 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000385 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000402 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000657 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000330 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000413 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000623 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000639 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000611 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000592 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000398 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000277 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000641 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000683 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000562 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000242 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000341 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000388 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000363 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000474 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000656 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000361 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000740 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000269 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000594 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000584 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000404 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000742 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000461 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000658 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000487 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000271 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000318 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000265 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000511 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000379 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000522 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000502 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000603 seconds. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3883
[LightGBM] [Info] Total Bins 1309
[LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573
[LightGBM] [Info] Start training from score 0.014573
[LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000608 seconds. You can set `force_col_wise=true` to remove the overhead.

... (this Info block repeats once per fit across the cross-validation folds and feature-elimination steps: train sets of 7823–7824 rows, used features stepping down from 22 to 15, total bins from 1309 to 1295, with near-identical class balance of roughly 3940 positive to 3883 negative in every fold) ...
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000379 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000494 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000475 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000579 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000279 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000232 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000620 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000374 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000580 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000100 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000596 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000618 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000596 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000518 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000496 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000388 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000445 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000300 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000291 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000243 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000530 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000312 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000343 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000839 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000505 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000312 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000241 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000319 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000472 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000349 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000413 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000370 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000294 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000370 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000223 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000241 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000296 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000478 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000302 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000392 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000796 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000256 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000294 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM training log truncated: each cross-validation fold repeats the same [Info] summary while the number of used features steps from 23 down to 15 (Total Bins 1311 -> 1295). Every fit reports ~7,823-7,824 training points with a near-balanced split (3,883-3,941 negatives vs. 3,940-3,941 positives, pavg ≈ 0.5036-0.5037, initscore ≈ 0.0143-0.0148) and an auto-chosen row-wise or col-wise multi-threading mode, which can be pinned with `force_row_wise=true` / `force_col_wise=true`.]
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000318 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000259 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000551 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000540 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000906 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000461 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. 
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000295 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000310 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000249 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000516 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000236 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000319 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000678 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000337 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000522 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000308 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000670 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000255 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000482 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000510 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000527 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000763 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000294 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000337 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000234 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000254 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000631 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000624 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000733 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000392 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000276 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000357 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000418 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000480 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000288 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000236 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000428 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000464 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000455 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000266 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000329 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000515 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000382 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000869 seconds. You can set `force_col_wise=true` to remove the overhead. 
[... output truncated: the [LightGBM] [Info] block above repeats for every cross-validation fold and feature subset (15-24 used features, 1295-1313 total bins, 7823-7824 training rows), differing only in the measured threading overhead and whether the row-wise or col-wise multi-threading hint is printed ...]
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000484 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000348 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000319 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000361 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000265 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000240 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000591 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000637 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000395 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000306 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000301 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000254 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000527 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000297 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000489 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000345 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000464 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000342 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000353 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000308 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000537 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000295 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000308 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000293 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000288 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000285 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000597 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000119 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000767 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000670 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000761 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000461 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000261 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000431 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000599 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000615 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. 
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000651 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000353 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000440 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000259 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000336 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000140 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM training log, truncated for readability. The repeated [Info] messages record one fit per cross-validation fold and feature-count step: 7,823–7,824 training rows per fold, used features stepping down from 26 to 15 (total bins 1295–1317), a near-balanced class split (≈3,940 positive vs ≈3,884 negative), and a starting score of initscore ≈ 0.0143–0.0148 derived from the positive-class prior (pavg ≈ 0.5036). LightGBM also reports auto-choosing row-wise or col-wise multi-threading, which can be pinned with `force_row_wise=true` or `force_col_wise=true` to avoid the selection overhead.]
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000269 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000510 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000479 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000263 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000341 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000307 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000312 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000348 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000254 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000359 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000267 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000532 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000487 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000629 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000355 seconds. You can set `force_row_wise=true` to remove the overhead. 
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000619 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000395 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000283 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000245 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000234 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000321 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000296 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000379 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000320 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000321 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000291 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000597 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000555 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000448 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000559 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000640 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000383 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000602 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000296 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000300 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826
[LightGBM] [Info] Start training from score 0.014826
[LightGBM] [Info] Number of positive: 3941, number of negative: 3883
[LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000492 seconds. You can set `force_col_wise=true` to remove the overhead.
[... repeated LightGBM training output truncated: the same [Info] messages (Total Bins, number of data points ~7823-7824, BoostFromScore, positive/negative counts ~3940/3884, and row-/col-wise multi-threading auto-selection hints) recur for each fold as the number of used features steps down from 27 to 15 ...]
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000595 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000535 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000464 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000446 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000442 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000344 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000304 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000306 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000469 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000580 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000122 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000572 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000101 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000104 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000212 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000628 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000351 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000653 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000261 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000264 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000628 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000310 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000286 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000248 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000740 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000362 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000348 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000319 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000300 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000341 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000371 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000700 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000332 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000261 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000293 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000607 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000296 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000295 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826
[LightGBM] [Info] Start training from score 0.014826
[LightGBM] [Info] Number of positive: 3941, number of negative: 3883
[LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000306 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`.

(Output truncated: the same [LightGBM] info messages repeat for every cross-validation fold and feature subset, with the number of used features stepping down from 27 to 15 and the class balance remaining approximately 3940 positive / 3884 negative out of 7824 training rows.)
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000393 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000710 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000595 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000725 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000580 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. You can set `force_row_wise=true` to remove the overhead. 
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000740 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000722 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000286 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000508 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000534 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000431 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000262 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000371 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000640 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000645 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000292 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000695 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000345 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000266 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000292 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000342 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000102 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000472 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000455 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000624 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000645 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000615 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000312 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000276 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000240 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000302 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000408 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000447 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000426 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000609 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000289 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000302 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000369 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826
[LightGBM] [Info] Start training from score 0.014826
[LightGBM] [Info] Number of positive: 3941, number of negative: 3883
[LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000327 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`.
...
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000136 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000280 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000297 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000736 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000343 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000301 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000301 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000349 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000378 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000335 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000478 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000256 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000422 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000283 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000629 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000675 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000334 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000658 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000296 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000283 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000273 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000272 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000635 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000520 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000249 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000342 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000638 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000313 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000521 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000307 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000344 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000351 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.001101 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000480 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000097 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000608 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000239 seconds. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3883
[LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000364 seconds.
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 1323
[LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573
[LightGBM] [Info] Start training from score 0.014573
... (near-identical [LightGBM] [Info] blocks truncated: one fit per cross-validation fold, with the number of used features stepping down from 29 to 15 and the fold training sets holding 7823-7824 rows with an almost balanced class split)
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000566 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000109 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000594 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000380 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000301 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000311 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000682 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000511 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000342 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000448 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000350 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000547 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000330 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000401 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000313 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000289 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000715 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000581 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000275 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000148 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000143 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000258 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000372 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000341 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000685 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000302 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000569 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000349 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000343 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000262 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000280 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000247 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000321 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] training log (condensed): the same block of messages repeats for every cross-validation fold as features are eliminated from 30 down to 15. Each run reports Total Bins (1295–1325), the train-set size (7,823–7,824 rows; roughly 3,940 positive vs. 3,883 negative, pavg ≈ 0.5036, initscore ≈ 0.0146), and an auto-chosen row-wise or col-wise multi-threading mode, along with the hint that `force_row_wise=true` (or `force_col_wise=true`) can be set to remove the auto-detection overhead.
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000401 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000338 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000332 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000350 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000482 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001830 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000444 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000395 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000347 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000646 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000372 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000370 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000336 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000307 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000295 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000318 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000338 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000122 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000132 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000260 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000335 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000687 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000346 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000360 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000342 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000453 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000424 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000380 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000457 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000311 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000446 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000318 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000452 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000612 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000263 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000824 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000478 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000521 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000358 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000660 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000336 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000345 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000302 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000361 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000329 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3884
[LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000111 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`.
[LightGBM] [Info] Total Bins 1303
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315
[LightGBM] [Info] Start training from score 0.014315
(Near-identical LightGBM training logs repeat for each cross-validation fold and candidate feature subset — the number of used features steps from 31 down to 15, with 7823–7824 training samples and ~3940 positive / ~3884 negative labels per fold; repeated log blocks omitted.)
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000354 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000810 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000336 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000681 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000535 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000384 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000300 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000395 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000275 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000729 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000344 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000293 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000546 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000424 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000334 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000420 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000681 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000640 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000356 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000295 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000362 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000296 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000342 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000308 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000272 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000282 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000536 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000405 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000417 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000389 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000344 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000350 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000358 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000338 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000630 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000534 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000336 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000313 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000340 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000331 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000347 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3884
[LightGBM] [Info] Total Bins 1327
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315
[LightGBM] [Info] Start training from score 0.014315
... (the same log block repeats for each feature-elimination step: the number of used features decreases from 31 to 15, Total Bins from 1327 to 1295, while the train-set size (7824) and class counts (3940 positive / 3884 negative) stay constant; LightGBM alternately auto-chooses row-wise or col-wise multi-threading and suggests `force_row_wise=true` / `force_col_wise=true` to remove the testing overhead)
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000665 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000308 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000464 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000920 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000288 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000313 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000306 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000509 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000643 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000332 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000640 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000693 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000606 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000412 seconds. You can set `force_row_wise=true` to remove the overhead. 
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000367 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000705 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000324 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000341 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000714 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000636 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000473 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000305 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000285 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000458 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000793 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000349 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000316 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000745 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000311 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000620 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000476 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000322 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000280 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000440 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000354 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000348 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000500 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000454 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000727 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000300 seconds. 
[LightGBM training log, condensed: the block below repeats once per cross-validation fold and per feature-elimination step, with the number of used features stepping down from 32 to 15 (Total Bins decreasing from 1329 to 1295 accordingly). Train-set sizes alternate between 7823 rows (3940 positive / 3883 negative, pavg=0.503643, initscore=0.014573) and 7824 rows (3941/3883, pavg=0.503707, initscore=0.014826; or 3940/3884, pavg=0.503579, initscore=0.014315). LightGBM auto-chooses row-wise or col-wise multi-threading per run and notes that `force_row_wise=true` or `force_col_wise=true` can be set to remove the testing overhead.]

[LightGBM] [Info] Number of positive: 3940, number of negative: 3883
[LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000803 seconds. You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 1329
[LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573
[LightGBM] [Info] Start training from score 0.014573
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000669 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000320 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000337 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000726 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000319 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000350 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000933 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000343 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000744 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000325 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000500 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000336 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000483 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000358 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000345 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000300 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000261 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000396 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000376 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000380 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000446 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000386 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000400 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000325 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000484 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000403 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000393 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000335 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000312 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000299 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000423 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000308 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000626 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000546 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000654 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000389 seconds. You can set `force_row_wise=true` to remove the overhead. 
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000337 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000331 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000364 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000427 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000364 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000327 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000734 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000348 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000366 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000347 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000652 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000289 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Number of positive: 3940, number of negative: 3884
[LightGBM] [Info] Total Bins 1303
[LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315
[LightGBM] [Info] Start training from score 0.014315
[LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000329 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`.
... (repeated LightGBM training logs truncated: the same messages recur for every cross-validation fold, with train sets of 7824 or 7823 rows, a near-balanced class split of roughly 3940 positive to 3884/3883 negative, and the number of used features stepping down from 33 to 15 as features are eliminated)
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000659 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000569 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000584 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000702 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000392 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000702 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000359 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000618 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000571 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000386 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000738 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000741 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000667 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000317 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000593 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000411 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000363 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000501 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000307 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000261 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000640 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000734 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000341 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000398 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000755 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000342 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000319 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000638 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000307 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000720 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000402 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000627 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000284 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000346 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000309 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000323 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000262 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3941, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000378 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503707 -> initscore=0.014826 [LightGBM] [Info] Start training from score 0.014826 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000329 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000422 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000688 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000472 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000341 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000363 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] training log (condensed — the identical messages repeated for every fit are omitted): each fit trained on 7824 rows (3940 positive, 3884 negative), boosting from an initial score of 0.014315 (pavg = 0.503579). The log cycles repeatedly through feature subsets shrinking from 33 down to 15 used features (total bins 1331 down to 1295), consistent with a feature-elimination loop run across cross-validation folds. For each fit, LightGBM auto-chose row-wise or col-wise multi-threading (testing overhead ~0.0003–0.0008 s) and noted that `force_row_wise=true` or `force_col_wise=true` can be set to remove that overhead.
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000306 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000442 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000346 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000532 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000311 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000410 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000455 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000312 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000439 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000390 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000361 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000736 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000339 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000593 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000290 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000615 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000697 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000326 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000672 seconds. 
You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000344 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000847 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000370 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000746 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000292 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000483 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000311 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000418 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000549 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000421 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000336 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000357 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000429 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000470 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000358 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000380 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000383 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000781 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000473 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000340 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000359 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000743 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000306 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000307 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000509 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000493 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000338 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7823, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503643 -> initscore=0.014573 [LightGBM] [Info] Start training from score 0.014573 [LightGBM] [Info] Number of positive: 3940, number of negative: 3883 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000311 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] verbose training output, repeated once per cross-validation fold and feature subset (truncated). Recurring contents: number of data points in the train set: 7823–7824; number of used features ranging from 33 down to 15; number of positive: 3940–3941, number of negative: 3883–3884; [binary:BoostFromScore]: pavg ≈ 0.5036 -> initscore ≈ 0.0146. Each run also auto-chooses row-wise or col-wise multi-threading and notes that setting `force_row_wise=true` or `force_col_wise=true` removes the overhead of that test.
[LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000505 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000337 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000303 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000514 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000389 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000301 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000376 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000298 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000241 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000516 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000718 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000797 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000379 seconds. You can set `force_row_wise=true` to remove the overhead. 
And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000614 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000294 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000347 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000457 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000525 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000681 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000344 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000714 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000315 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000401 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000269 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000347 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000312 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000743 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000127 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1299 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 17 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000282 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1297 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 16 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000607 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1295 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 15 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000328 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000651 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1331 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 33 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000333 seconds. 
You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1329 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 32 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000347 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1327 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 31 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000635 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1325 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 30 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000750 seconds. You can set `force_col_wise=true` to remove the overhead. 
[LightGBM] [Info] Total Bins 1323 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 29 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000671 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1321 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 28 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000356 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1319 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 27 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000313 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1317 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 26 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000376 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1315 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 25 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000364 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1313 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 24 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000319 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1311 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 23 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000347 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1309 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 22 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000300 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1307 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 21 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000395 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM] [Info] Total Bins 1305 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 20 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing col-wise multi-threading, the overhead of testing was 0.000464 seconds. You can set `force_col_wise=true` to remove the overhead. [LightGBM] [Info] Total Bins 1303 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 19 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000348 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. [LightGBM] [Info] Total Bins 1301 [LightGBM] [Info] Number of data points in the train set: 7824, number of used features: 18 [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.503579 -> initscore=0.014315 [LightGBM] [Info] Start training from score 0.014315 [LightGBM] [Info] Number of positive: 3940, number of negative: 3884 [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000335 seconds. You can set `force_row_wise=true` to remove the overhead. And if memory is not enough, you can set `force_col_wise=true`. 
[LightGBM training log truncated: each fit used the 7824-sample training set (3940 positive, 3884 negative), with the number of used features stepped down from 33 to 15 during feature elimination.]
The accuracy score for the LGBM classifier does not change significantly when more than 11 features are used, while the accuracy score for RFE plateaus after 14 features. The LGBM classifier's accuracy is generally slightly higher than that of RFE when more than 14 features are used. Therefore, the top 15 features identified by the LGBM classifier have been selected and are used for the remainder of the project. Choosing fewer features is appropriate, as it reduces computational complexity without a notable decrease in accuracy.
5. Model Tuning¶
Model tuning, also known as hyperparameter optimization, involves finding the optimal combination of parameters to control the learning process of a machine learning model [19]. These hyperparameters play a crucial role in determining the performance of the model. Unlike model parameters, which are learned from the training data, hyperparameters are predefined settings that are not derived from the data itself [20].
Three methods commonly used for hyperparameter optimization are GridSearch, RandomSearch and Optuna [20]. In Section 5.1, GridSearch is employed to search through a predefined grid of hyperparameters. In Section 5.2, Optuna is utilized for efficient exploration of the hyperparameter space [20]. In Section 5.3, Random Search is utilized for hyperparameter optimization.
The performance of these three hyperparameter optimization methods is compared in Section 5.4, providing insight into their effectiveness in improving model performance.
5.1 Model Tuning using GridSearch¶
GridSearch is a hyperparameter optimization method utilized in machine learning to identify the optimal combination of hyperparameters [21]. It performs an exhaustive search through a manually specified hyperparameter space [21], employing cross-validation or other evaluation methods to assess each combination. This method is powerful because it guarantees finding the best solution within the defined grid, although it can be time-consuming, especially for extensive hyperparameter spaces.
gridSearch_modelsAccuracy = []
5.1.1 Logistic Regression¶
The logistic regression model, employing the sigmoid function, serves as a statistical tool for binary classification [22][23], determining the probability of a data entry belonging to one of two categories [23]. This probability is derived from applying a logistic function to a linear equation, yielding a value between 0 and 1 [23].
Advantages and disadvantages of logistic regression are summarized as follows [24]:
Advantages:
- Simple and easy to grasp, making it an ideal starting point for classification tasks.
- Computational simplicity leads to faster training compared to other models.
- Coefficients offer insights into feature importance.
- Can be extended to handle multi-class classification problems.
Disadvantages:
- Assumes a linear relationship between independent variables and the log odds of the dependent variable.
- Limited when data isn't linearly separable.
- Fails to capture complex relationships within data.
While logistic regression presents computational simplicity, its efficacy may be hindered when data lacks a clear linear relationship, as observed earlier in this notebook.
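To make the sigmoid mapping concrete, a minimal self-contained sketch (the weights, bias, and feature values below are hypothetical, not taken from the fitted model):

```python
import numpy as np

# Logistic regression maps a linear combination of features to a
# probability in (0, 1) via the sigmoid (logistic) function.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical learned coefficients, bias, and one feature vector.
w = np.array([0.8, -1.2, 0.3])
b = 0.5
x = np.array([1.0, 0.5, 2.0])

p = sigmoid(np.dot(w, x) + b)  # probability of the positive class
print(round(p, 4))
```

A threshold (typically 0.5) on this probability then yields the binary class label.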
In the implementation below [25][26], focus is placed on the following hyperparameters:
- "C": Represents the inverse of regularization strength, preventing overfitting. Smaller "C" values penalize larger model parameters, while larger values emphasize the training data, potentially leading to a more complex model.
- "penalty": Dictates the type of regularization applied. "L1" (Lasso regularization) reduces parameters by setting some coefficients to zero, while "L2" (Ridge regularization) shrinks coefficients towards zero.
param_grid = {
    'C': [0.001, 0.01, 0.1, 1, 10, 100],
    'penalty': ['l1', 'l2']
}
X_train, X_test, y_train, y_test = train_test_split(X_FeatureSelection_LGM, y, test_size=0.2, random_state=42)
lr = LogisticRegression(solver='liblinear')
grid_search = GridSearchCV(estimator=lr, param_grid=param_grid, cv=cv, scoring='accuracy', return_train_score=True, verbose=0)
grid_search.fit(X_train, y_train)
best_params = grid_search.best_params_
best_lr = grid_search.best_estimator_
accuracy = best_lr.score(X_test, y_test)
gridSearch_modelsAccuracy.append(accuracy)
print(f"Best Parameters: {best_params}")
print(f"Test Accuracy: {accuracy:.4f}")
Best Parameters: {'C': 100, 'penalty': 'l1'}
Test Accuracy: 0.7855
5.1.2 Random Forest Classifier¶
param_grid = {
    'n_estimators': [100, 200, 300],
    'max_depth': [None, 10, 20],
    'min_samples_split': [2, 5, 10],
    'min_samples_leaf': [1, 2, 4]
}
rf_classifier = RandomForestClassifier()
grid_search = GridSearchCV(estimator=rf_classifier, param_grid=param_grid, cv=cv, n_jobs=-1, scoring='accuracy', return_train_score=True, verbose=0)
grid_search.fit(X_train, y_train)
best_params = grid_search.best_params_
best_rf = grid_search.best_estimator_
accuracy = best_rf.score(X_test, y_test)
gridSearch_modelsAccuracy.append(accuracy)
print(f"Best Parameters: {best_params}")
print(f"Test Accuracy: {accuracy:.4f}")
Best Parameters: {'max_depth': 20, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 200}
Test Accuracy: 0.7826
The Random Forest Classifier operates by constructing multiple decision trees, each utilizing a random subset of features from the dataset [27][28]. Prediction is made by aggregating the results of all trees through voting in classification tasks or averaging for regression tasks [28]. Notably, this method demonstrates several advantageous features [28]:
- High predictive accuracy
- Resistance to overfitting
- Ability to handle large datasets
- Improved speed through parallelization
In the implementation, the Random Forest Classifier was optimized using grid search, considering four key hyperparameters [29]:
- Number of trees to be created (n_estimators)
- Maximum depth of the trees (max_depth)
- Minimum number of samples required to split an internal node (min_samples_split)
- Minimum number of samples required to be at a leaf node (min_samples_leaf)
Additionally, 5-fold cross-validation was employed, and n_jobs was set to -1 to utilize all available CPUs for computation. The optimal hyperparameters determined through this process are printed above.
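For context, a tuned forest's `feature_importances_` attribute can be used to see which features drive its predictions. A minimal self-contained sketch (synthetic data; the column names below are made up for illustration):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Synthetic data: the target depends on the first two columns only.
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "CryoSleep": rng.integers(0, 2, 500),
    "Spa": rng.normal(size=500),
    "Noise": rng.normal(size=500),
})
y = ((X["Spa"] + X["CryoSleep"]) > 0.5).astype(int)

rf = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)

# Mean-impurity-decrease importances; they sum to 1 across features.
imp = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
print(imp)
```

The uninformative "Noise" column receives a much lower importance than the two predictive columns, which is the kind of check that can sanity-test a tuned model.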
5.1.3 Naive Bayes¶
5.1.3.1 Gaussian Naive Bayes¶
from sklearn.naive_bayes import GaussianNB
# create the Gaussian Naive Bayes classifier
gnb = GaussianNB()
# Define the parameter grid
param_grid = {
    'var_smoothing': np.logspace(0, -9, num=100)
}
# Instantiate the GridSearchCV object
grid_search = GridSearchCV(estimator=gnb, param_grid=param_grid, cv=cv, verbose=0)
# Fit the model
grid_search.fit(X_train, y_train)
# Get the best parameters
best_params = grid_search.best_params_
print(f"Best parameters: {best_params}")
# Use the best estimator to make predictions and score the test set
best_gnb = grid_search.best_estimator_
predictions = best_gnb.predict(X_test)
accuracy = best_gnb.score(X_test, y_test)
gridSearch_modelsAccuracy.append(accuracy)
print(f"Test Accuracy: {accuracy:.4f}")
Best parameters: {'var_smoothing': 6.579332246575683e-05}
Test Accuracy: 0.7079
Naive Bayes Classification methods are grounded in probability theory, specifically Bayes' Theorem [30]. Unlike other methods focusing on feature importance, Naive Bayes Classifiers aim to understand the distribution of inputs across classes [30][31]. They make a fundamental assumption that all features are independent of each other [32].
Gaussian Naive Bayes (GNB) is a variant of Naive Bayes that assumes features follow a Gaussian distribution [32][33]. During operation, GNB calculates the probability of an instance belonging to each class, selecting the class with the highest probability as the prediction [33]. It excels in both binary and multi-class classification, particularly when the assumption of feature independence holds true [31]. GNB also copes well with datasets containing many features, since the independence assumption keeps its complexity linear in the number of features.
The parameter that can be adjusted in GNB is var_smoothing, which accommodates data not conforming to the Gaussian Distribution [34]. A logarithmic scale ranging from 10^0 to 10^-9 was explored for var_smoothing, yielding the optimal value above. Similar to previous models, 5-fold cross-validation was employed [34].
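Concretely, scikit-learn adds `var_smoothing` times the largest feature variance to every per-class variance, which stabilises the Gaussian likelihoods for near-constant features. A small self-contained sketch (toy values) showing the additive term via the fitted `epsilon_` attribute:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Toy data: the second feature has a much larger variance than the first.
X = np.array([[1.0, 100.0], [1.1, 110.0], [0.9, 90.0], [1.0, 95.0]])
y = np.array([0, 0, 1, 1])

gnb = GaussianNB(var_smoothing=1e-2).fit(X, y)

# epsilon_ = var_smoothing * (largest feature variance); this amount is
# added to every per-class variance before computing likelihoods.
print(gnb.epsilon_)
```

Larger `var_smoothing` values flatten the Gaussians, which can help when the data departs from the normality assumption.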
5.1.3.2 Bernoulli Naive Bayes¶
from sklearn.naive_bayes import BernoulliNB
# Instantiate the Bernoulli Naive Bayes classifier
bnb = BernoulliNB()
# Define the parameter grid
param_grid = {
    'alpha': [0.01, 0.1, 1.0, 10.0, 100.0],  # Additive (Laplace/Lidstone) smoothing parameter
    'binarize': [0.0, 0.5, 1.0],             # Threshold for binarizing sample features
    'fit_prior': [True, False]               # Whether to learn class prior probabilities
}
# Instantiate the GridSearchCV object
grid_search = GridSearchCV(estimator=bnb, param_grid=param_grid, cv=cv, return_train_score=True, verbose=0)
# Fit the model
grid_search.fit(X_train, y_train)
# Get the best parameters
best_params = grid_search.best_params_
print(f"Best parameters: {best_params}")
# Use the best estimator to make predictions
best_bnb = grid_search.best_estimator_
predictions = best_bnb.predict(X_test)
# Calculate the accuracy
accuracy = best_bnb.score(X_test, y_test)
gridSearch_modelsAccuracy.append(accuracy)
print(f"Test Accuracy: {accuracy:.4f}")
Best parameters: {'alpha': 0.01, 'binarize': 0.0, 'fit_prior': False}
Test Accuracy: 0.7487
The Bernoulli Naive Bayes (BNB) method operates similarly to Gaussian Naive Bayes but assumes that all features are binary, making predictions based on the Bernoulli Distribution. All other assumptions and methods of GNB apply to BNB as well [36][37].
Various hyperparameters were considered for tuning:
- Additive smoothing (alpha)
- The threshold for mapping features to booleans (binarize)
- Whether or not to consider prior probabilities of the class (fit_prior)
After hyperparameter tuning using 5-fold cross-validation, the optimal parameters were found and printed above. Notably, the test accuracy of the BNB method was found to be superior to that of the GNB method.
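A small self-contained sketch (toy values, not from this dataset) of how the `binarize` threshold feeds into the Bernoulli likelihoods:

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Continuous features; with binarize=0.5 each value is mapped to 0 or 1
# before the per-feature Bernoulli likelihoods are estimated.
X = np.array([[0.2, 0.9], [0.8, 0.1], [0.3, 0.7], [0.9, 0.2]])
y = np.array([1, 0, 1, 0])

bnb = BernoulliNB(binarize=0.5, alpha=0.01).fit(X, y)

# After thresholding, this instance becomes [0, 1] — the pattern seen
# only in the class-1 training rows.
print(bnb.predict([[0.1, 0.95]]))
```

Shifting `binarize` changes which entries count as "present", which is why it was included in the tuning grid above.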
5.1.4 XGBoostClassifier¶
The XGBoostClassifier is an advanced implementation of the gradient boosting algorithm for decision trees [38]. It builds a number of decision trees sequentially, such that each tree corrects the errors of the previous one [38]; each added tree fits the residuals of the current ensemble, progressively improving performance.
Some advantages and disadvantages of the XGBoostClassifier are outlined below [38]:
Advantages
- The XGBoostClassifier tends to produce better results than other models.
- The model is optimized for speed and performance, utilizing parallel processing during training.
- The model can handle missing data without requiring the use of an imputer.
- It utilizes both the L1 and L2 regularization mentioned in logistic regression to prevent overfitting.
- It is capable of handling larger datasets and allows for custom optimization and evaluation criteria.
Disadvantages
- The model may require tuning to achieve good results.
- If the model is not configured correctly, it can lead to overfitting.
- It can be resource-intensive when used with large datasets.
In the implementation below, the hyperparameters that are explored are explained as follows [39][40].
- "n_estimators": Defines the number of trees in the model's ensemble. Adding more trees can improve the model's ability to learn from the data but can eventually lead to overfitting.
- "max_depth": Sets the maximum depth of each tree. Deeper trees can capture more complex patterns but can also lead to overfitting.
- "min_samples_split": The minimum number of samples required to split an internal node, causing branching of the trees. This is used to control overfitting; a smaller value produces more splits and hence more complex trees.
- "min_samples_leaf": The minimum number of samples required at a leaf node. A split point at any depth is only considered if it leaves at least this many training samples in each branch.
param_grid = {
'n_estimators': [100, 200, 300],
'max_depth': [None, 10, 20],
'min_samples_split': [2, 5, 10],
'min_samples_leaf': [1, 2, 4]
}
xgb_classifier = XGBClassifier()
grid_search = GridSearchCV(estimator=xgb_classifier, param_grid=param_grid, cv=cv, n_jobs=-1, scoring='accuracy', return_train_score=True, verbose=0)
grid_search.fit(X_train, y_train)
best_params = grid_search.best_params_
best_XG = grid_search.best_estimator_
accuracy = best_XG.score(X_test, y_test)
gridSearch_modelsAccuracy.append(accuracy)
print(f"Best Parameters: {best_params}")
print(f"Test Accuracy: {accuracy:.4f}")
Best Parameters: {'max_depth': None, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 100}
Test Accuracy: 0.7867
5.1.5 CatBoostClassifier¶
The CatBoostClassifier, or Categorical Booster (CatBoost), is a boosting classification library designed for scenarios with numerous independent features [41]. It utilizes gradient boosting to create a series of decision tree models, where each new tree aims to correct the errors of the previous one [42]. This approach offers two main advantages: achieving highly accurate evaluations on various classification problems and reducing the risk of overfitting [41].
One distinctive feature of CatBoost is its ability to handle categorical features without requiring data conversion techniques such as One Hot Encoding (OHE). However, this feature was not used here, as it was deemed unnecessary once issues such as multicollinearity had been addressed during the data encoding process.
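CatBoost's native categorical handling is built on ordered target statistics: each row's category value is replaced by a smoothed mean of the target computed over *preceding* rows only, so the encoding never uses that row's own label. The sketch below illustrates the idea in plain Python; the prior and smoothing weight are illustrative choices, not CatBoost's exact formula.

```python
# Ordered target statistics: encode each row's category using only the rows
# seen before it, preventing target leakage. The prior/weight constants here
# are illustrative, not CatBoost's actual parameters.

def ordered_target_stats(categories, targets, prior=0.5, weight=1.0):
    sums, counts = {}, {}
    encoded = []
    for cat, y in zip(categories, targets):
        s, c = sums.get(cat, 0.0), counts.get(cat, 0)
        # smoothed mean of the target over previously seen rows of this category
        encoded.append((s + weight * prior) / (c + weight))
        sums[cat] = s + y
        counts[cat] = c + 1
    return encoded

cats = ["Earth", "Mars", "Earth", "Earth", "Mars"]
ys   = [1, 0, 1, 0, 0]
print(ordered_target_stats(cats, ys))  # → [0.5, 0.5, 0.75, 0.833..., 0.25]
```

In CatBoost the row ordering is additionally randomized across several permutations, which is part of why it resists overfitting on categorical features.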
param_grid = {
'iterations': [100, 200, 300],
'depth': [4, 5, 6],
'learning_rate': [0.1, 0.01, 0.05]
}
catboost_classifier = CatBoostClassifier()  # note: CatBoostClassifier(verbose=0) would suppress the per-iteration training log printed below
grid_search = GridSearchCV(estimator=catboost_classifier, param_grid=param_grid, cv=cv, n_jobs=-1, scoring='accuracy', return_train_score=True, verbose = 0)
grid_search.fit(X_train, y_train)
best_params = grid_search.best_params_
best_catboost = grid_search.best_estimator_
accuracy = best_catboost.score(X_test, y_test)
gridSearch_modelsAccuracy.append(accuracy)
print(f"Best Parameters: {best_params}")
print(f"Test Accuracy: {accuracy:.4f}")
0: learn: 0.6739366 total: 2.77ms remaining: 828ms
[... per-iteration CatBoost training log truncated: 300 iterations, training loss falling from 0.674 to 0.347 ...]
299: learn: 0.3473462 total: 784ms remaining: 0us
Best Parameters: {'depth': 6, 'iterations': 300, 'learning_rate': 0.05}
Test Accuracy: 0.7941
5.2 Model Tuning using Optuna¶
Optuna, an open-source hyperparameter optimization framework for Python, automates the search for optimal hyperparameters across various machine learning and deep learning frameworks [43]. Unlike grid search or random search, Optuna employs algorithms such as Bayesian optimization with the Tree-structured Parzen Estimator (TPE) and Hyperband, improving computational efficiency by prioritizing promising regions of the hyperparameter space and converging faster to the best hyperparameters [44][45].
Key features of Optuna include:
- Utilization of Bayesian optimization with TPE, alongside Hyperband, for efficient hyperparameter search.
- Pruning strategies that terminate trials unlikely to yield good results early in the process, accelerating the search.
- Flexibility to define custom search spaces for hyperparameters.
- Visualization features to facilitate understanding of hyperparameter performance.
- Control over how trials are sampled and pruned.
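The pruning idea Optuna borrows from Hyperband can be sketched with plain successive halving: give a small budget to many configurations, keep the most promising half, double the budget, and repeat. This is an illustrative stand-in only — Optuna's actual samplers and pruners are considerably more sophisticated, and `evaluate` below is a fake scoring function, not a real training run.

```python
import random

# Successive halving, the core of Hyperband-style pruning: spend a little on
# many candidates, keep the top half each round, spend more on the survivors.

def evaluate(config, budget):
    # Fake objective: configs closer to 0.3 score higher; a larger budget
    # (think: more training steps) means a less noisy estimate.
    noise = random.gauss(0, 0.2 / budget)
    return -abs(config - 0.3) + noise

def successive_halving(configs, start_budget=1, rounds=4):
    survivors, budget = list(configs), start_budget
    for _ in range(rounds):
        scored = sorted(survivors, key=lambda c: evaluate(c, budget), reverse=True)
        survivors = scored[: max(1, len(scored) // 2)]  # keep the top half
        budget *= 2                                     # double budget for survivors
    return survivors[0]

best = successive_halving([i / 16 for i in range(16)])
print(best)  # typically (not always) lands near the optimum 0.3
```

The economy comes from the schedule: most of the total budget is spent on the few configurations that survived the cheap early rounds, which is also what makes a trial "prunable" in Optuna.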
optuna_modelsAccuracy = []
5.2.1 Logistic Regression¶
#define objective function for hyperparameter optimization using optuna
def objective_LogisticRegression(trial):
#define hyperparameters to optimize for
params = {
"C" : trial.suggest_loguniform('C', 1e-3, 1e3),
"tol" : trial.suggest_uniform('tol', 1e-6, 1e-3)
}
#create LogisticRegression model with the optimized hyperparameters
model = LogisticRegression(**params, random_state=0)
#evaluate model using cross-validation
#Depending on what features will be utilized replace X with the corresponding dataframe
score = cross_val_score(model, X_FeatureSelection_LGM, y, cv=cv).mean()
return score
#run hyperparameter optimization with optuna
study = optuna.create_study(direction='maximize')
study.optimize(objective_LogisticRegression, n_trials=10)
[I 2024-04-08 03:15:21,376] A new study created in memory with name: no-name-706efee7-c392-4ad3-a8b5-d582776f9bc1
[I 2024-04-08 03:15:22,277] Trial 0 finished with value: 0.792133587584088 and parameters: {'C': 0.19331494013049183, 'tol': 0.000834903529503678}. Best is trial 0 with value: 0.792133587584088.
[I 2024-04-08 03:15:22,505] Trial 1 finished with value: 0.7851144128396631 and parameters: {'C': 0.008890565173813937, 'tol': 0.0008186298393214773}. Best is trial 0 with value: 0.792133587584088.
[I 2024-04-08 03:15:22,704] Trial 2 finished with value: 0.7791413740134541 and parameters: {'C': 0.0019676439266963467, 'tol': 0.0009296749638516682}. Best is trial 0 with value: 0.792133587584088.
[I 2024-04-08 03:15:23,032] Trial 3 finished with value: 0.7950119180041315 and parameters: {'C': 844.186224616539, 'tol': 0.0004129301773877483}. Best is trial 3 with value: 0.7950119180041315.
[I 2024-04-08 03:15:23,315] Trial 4 finished with value: 0.7910972509137136 and parameters: {'C': 0.06025838723781651, 'tol': 0.0008109147550478474}. Best is trial 3 with value: 0.7950119180041315.
[I 2024-04-08 03:15:23,590] Trial 5 finished with value: 0.7894867312887335 and parameters: {'C': 0.04600397072958776, 'tol': 0.0005387084977038127}. Best is trial 3 with value: 0.7950119180041315.
[I 2024-04-08 03:15:23,919] Trial 6 finished with value: 0.7952468351077917 and parameters: {'C': 80.9010834083927, 'tol': 0.0003318497428826118}. Best is trial 6 with value: 0.7952468351077917.
[I 2024-04-08 03:15:24,153] Trial 7 finished with value: 0.7861504846655013 and parameters: {'C': 0.013766036936262181, 'tol': 0.0008541586975940979}. Best is trial 6 with value: 0.7952468351077917.
[I 2024-04-08 03:15:24,455] Trial 8 finished with value: 0.792354732771863 and parameters: {'C': 0.17468172585243513, 'tol': 0.0002815281819825427}. Best is trial 6 with value: 0.7952468351077917.
[I 2024-04-08 03:15:24,771] Trial 9 finished with value: 0.7932793050479369 and parameters: {'C': 0.4843544395085441, 'tol': 0.0002899230709361639}. Best is trial 6 with value: 0.7952468351077917.
#get best hyperparameters
best_params_logitsticRegression = study.best_params
print(f'Best hyperparameters: {best_params_logitsticRegression}')
Best hyperparameters: {'C': 80.9010834083927, 'tol': 0.0003318497428826118}
#create LogisticRegression model with best hyperparameters
model_logisticRegression = LogisticRegression(**best_params_logitsticRegression, random_state=0)
#fit and predict using model
model_logisticRegression.fit(X_FeatureSelection_LGM, y)
predictions_logisticRegression = model_logisticRegression.predict(test_FeatureSelection_LGM)
#scores = cross_val_score(model_logisticRegression, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=20, shuffle = True), scoring='accuracy')
scorer = make_scorer(accuracy_score)
scores = cross_validate(model_logisticRegression, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
optuna_modelsAccuracy.append(scores['test_score'].mean())
scores['test_score'].mean()
0.7951210339530697
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Logistic Regression Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.7959822564233486
test average : 0.7951210339530697
5.2.2 Random Forest Classifier¶
#define objective function for hyperparameter optimization using optuna
def objective_RandomForest(trial):
#define hyperparameters to optimize for
params = {
"n_estimators" : trial.suggest_int('n_estimators', 50, 1000),
"max_depth" : trial.suggest_int('max_depth', 10, 100),
"min_samples_split" : trial.suggest_int('min_samples_split', 2, 20),
'min_samples_leaf': trial.suggest_int('min_samples_leaf', 2, 4),
}
#create RandomForestClassifier model with the optimized hyperparameters
model = RandomForestClassifier(**params, random_state=0)
#evaluate model using cross-validation
#Depending on what features will be utilized replace X with the corresponding dataframe
score = cross_val_score(model, X_FeatureSelection_LGM, y, cv=cv).mean()
return score
#run hyperparameter optimization with optuna
study.optimize(objective_RandomForest, n_trials=10)
[I 2024-04-08 03:17:28,431] Trial 10 finished with value: 0.8008784893267652 and parameters: {'n_estimators': 847, 'max_depth': 49, 'min_samples_split': 4, 'min_samples_leaf': 4}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:19:34,575] Trial 11 finished with value: 0.7996075003972669 and parameters: {'n_estimators': 870, 'max_depth': 52, 'min_samples_split': 3, 'min_samples_leaf': 4}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:21:50,198] Trial 12 finished with value: 0.7991479951268605 and parameters: {'n_estimators': 924, 'max_depth': 54, 'min_samples_split': 3, 'min_samples_leaf': 4}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:23:52,343] Trial 13 finished with value: 0.799033052598125 and parameters: {'n_estimators': 851, 'max_depth': 49, 'min_samples_split': 3, 'min_samples_leaf': 4}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:25:08,136] Trial 14 finished with value: 0.8001957201122941 and parameters: {'n_estimators': 525, 'max_depth': 19, 'min_samples_split': 8, 'min_samples_leaf': 4}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:25:52,686] Trial 15 finished with value: 0.7945502939774352 and parameters: {'n_estimators': 358, 'max_depth': 11, 'min_samples_split': 13, 'min_samples_leaf': 2}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:27:03,510] Trial 16 finished with value: 0.7984546321309393 and parameters: {'n_estimators': 540, 'max_depth': 12, 'min_samples_split': 9, 'min_samples_leaf': 3}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:28:21,563] Trial 17 finished with value: 0.7990364955770962 and parameters: {'n_estimators': 571, 'max_depth': 93, 'min_samples_split': 20, 'min_samples_leaf': 3}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:28:44,424] Trial 18 finished with value: 0.7973128873351344 and parameters: {'n_estimators': 158, 'max_depth': 28, 'min_samples_split': 9, 'min_samples_leaf': 4}. Best is trial 10 with value: 0.8008784893267652.
[I 2024-04-08 03:30:59,475] Trial 19 finished with value: 0.7989247311827957 and parameters: {'n_estimators': 684, 'max_depth': 89, 'min_samples_split': 7, 'min_samples_leaf': 3}. Best is trial 10 with value: 0.8008784893267652.
#get best hyperparameters
best_params_randomforest = study.best_params
print(f'Best hyperparameters: {best_params_randomforest}')
Best hyperparameters: {'n_estimators': 847, 'max_depth': 49, 'min_samples_split': 4, 'min_samples_leaf': 4}
When utilizing Optuna to determine the optimal hyperparameters, following the same setup as in the LGBM and RFE methods, the best hyperparameters are found and printed above.
#create RandomForestClassifier model with best hyperparameters
model_RandomForestClassifier = RandomForestClassifier(**best_params_randomforest, random_state=0)
#fit and predict using model
model_RandomForestClassifier.fit(X_FeatureSelection_LGM, y)
predictions_RandomForestClassifier = model_RandomForestClassifier.predict(test_FeatureSelection_LGM)
#scores = cross_val_score(model_RandomForestClassifier, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=10, shuffle = True), scoring='accuracy')
scores = cross_validate(model_RandomForestClassifier, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
optuna_modelsAccuracy.append(scores['test_score'].mean())
scores['test_score'].mean()
0.7986972297261508
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Random Forest Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.8508479473180743
test average : 0.7986972297261508
5.2.3 Naive Bayes¶
5.2.3.1 Gaussian Naive Bayes¶
import optuna
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
def objective_GNB(trial):
# Define the hyperparameters to optimize
var_smoothing = trial.suggest_loguniform('var_smoothing', 1e-10, 1e-1)
# Create the Gaussian Naive Bayes classifier with the hyperparameters
model = GaussianNB(var_smoothing=var_smoothing)
# Evaluate the model using cross-validation
score = cross_val_score(model, X_FeatureSelection_LGM, y, cv=cv).mean()
return score
# Create an Optuna study and optimize the objective function
study = optuna.create_study(direction='maximize')
study.optimize(objective_GNB, n_trials=10)
[I 2024-04-08 03:34:05,531] A new study created in memory with name: no-name-9bf432fd-bae1-492b-986f-3201eaf112ca
[I 2024-04-08 03:34:05,632] Trial 0 finished with value: 0.708163303141056 and parameters: {'var_smoothing': 1.1425773099782466e-08}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:05,723] Trial 1 finished with value: 0.7064333386302241 and parameters: {'var_smoothing': 6.849192084216105e-09}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:05,814] Trial 2 finished with value: 0.7062037184172891 and parameters: {'var_smoothing': 0.00011631667026443376}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:05,905] Trial 3 finished with value: 0.7058649822554159 and parameters: {'var_smoothing': 1.8553434714592924e-08}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:05,999] Trial 4 finished with value: 0.6999912601303035 and parameters: {'var_smoothing': 0.009584908329675746}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:06,077] Trial 5 finished with value: 0.7066587213305789 and parameters: {'var_smoothing': 1.8685276725028394e-09}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:06,153] Trial 6 finished with value: 0.7079238836802797 and parameters: {'var_smoothing': 3.5115054807164617e-07}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:06,229] Trial 7 finished with value: 0.7059682716245563 and parameters: {'var_smoothing': 0.0006278527322799388}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:06,301] Trial 8 finished with value: 0.7074672916997723 and parameters: {'var_smoothing': 3.633305496592066e-10}. Best is trial 0 with value: 0.708163303141056.
[I 2024-04-08 03:34:06,381] Trial 9 finished with value: 0.7064285714285714 and parameters: {'var_smoothing': 4.060364818078787e-05}. Best is trial 0 with value: 0.708163303141056.
#get best hyperparameters
best_params_GNB = study.best_params
best_accuracy_GNB = study.best_value
print(f'Best hyperparameters: {best_params_GNB}')
Best hyperparameters: {'var_smoothing': 1.1425773099782466e-08}
When employing Optuna to identify the optimal hyperparameters, following the same setup as in the LGBM and RFE methods, the best hyperparameter is found.
#create GaussianNB model with best hyperparameters
model_GNB = GaussianNB(**best_params_GNB)
#fit and predict using model
model_GNB.fit(X_FeatureSelection_LGM, y)
predictions_GNBClassifier = model_GNB.predict(test_FeatureSelection_LGM)
#scores = cross_val_score(model_logisticRegression, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=20, shuffle = True), scoring='accuracy')
scorer = make_scorer(accuracy_score)
scores = cross_validate(model_GNB, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
optuna_modelsAccuracy.append(scores['test_score'].mean())
scores['test_score'].mean()
0.7070075215848296
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Gaussian Naive Bayes Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.7071994073793603
test average : 0.7070075215848296
5.2.3.2 Bernoulli Naive Bayes¶
import optuna
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB
def objective_BNB(trial):
# Define the hyperparameters to optimize
alpha = trial.suggest_loguniform('alpha', 1e-10, 1e-1)
binarize = trial.suggest_uniform('binarize', 0.0, 1.0)
fit_prior = trial.suggest_categorical('fit_prior', [True, False])
# Create the Bernoulli Naive Bayes classifier with the hyperparameters
model = BernoulliNB(alpha=alpha, binarize=binarize, fit_prior=fit_prior)
# Evaluate the model using cross-validation
score = cross_val_score(model, X_FeatureSelection_LGM, y, cv=cv).mean()
return score
# Create an Optuna study and optimize the objective function
study = optuna.create_study(direction='maximize')
study.optimize(objective_BNB, n_trials=10)
# Get the best hyperparameters and the best accuracy
best_params = study.best_params
best_accuracy = study.best_value
[I 2024-04-08 03:34:06,703] A new study created in memory with name: no-name-69048f52-64a2-4486-b967-d63dadc0405b
[I 2024-04-08 03:34:06,811] Trial 0 finished with value: 0.7426754595052703 and parameters: {'alpha': 1.8673870862698827e-10, 'binarize': 0.36464781241087096, 'fit_prior': True}. Best is trial 0 with value: 0.7426754595052703.
[I 2024-04-08 03:34:06,910] Trial 1 finished with value: 0.748998093119339 and parameters: {'alpha': 0.00013753145737869973, 'binarize': 0.19073376401728848, 'fit_prior': False}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,016] Trial 2 finished with value: 0.742890248424175 and parameters: {'alpha': 3.301817855293904e-05, 'binarize': 0.3171726563853191, 'fit_prior': False}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,115] Trial 3 finished with value: 0.742091212458287 and parameters: {'alpha': 6.736315387792665e-06, 'binarize': 0.6414837596521008, 'fit_prior': True}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,216] Trial 4 finished with value: 0.7419850097992479 and parameters: {'alpha': 1.553971767310266e-08, 'binarize': 0.8104503413541061, 'fit_prior': False}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,326] Trial 5 finished with value: 0.7426579797658773 and parameters: {'alpha': 0.002918741791422071, 'binarize': 0.9546490858524869, 'fit_prior': False}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,429] Trial 6 finished with value: 0.7415077599449122 and parameters: {'alpha': 0.0023032750895559725, 'binarize': 0.8744375633814825, 'fit_prior': True}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,526] Trial 7 finished with value: 0.7411753800519095 and parameters: {'alpha': 0.00023720731526559466, 'binarize': 0.6143120117935071, 'fit_prior': False}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,645] Trial 8 finished with value: 0.7476156046400763 and parameters: {'alpha': 4.9159686514648905e-08, 'binarize': 0.17174476915376102, 'fit_prior': True}. Best is trial 1 with value: 0.748998093119339.
[I 2024-04-08 03:34:07,755] Trial 9 finished with value: 0.741518618570899 and parameters: {'alpha': 1.7744624182424156e-05, 'binarize': 0.9799541614716065, 'fit_prior': False}. Best is trial 1 with value: 0.748998093119339.
#get best hyperparameters
best_params_BNB = study.best_params
best_accuracy_BNB = study.best_value
print(f'Best hyperparameters: {best_params_BNB}')
Best hyperparameters: {'alpha': 0.00013753145737869973, 'binarize': 0.19073376401728848, 'fit_prior': False}
Using Optuna with the same configuration as in the LGBM and RFE methods, the optimal hyperparameters identified are shown above.
#create BernoulliNB model with best hyperparameters
model_BNB = BernoulliNB(**best_params_BNB)
#fit and predict using model
model_BNB.fit(X_FeatureSelection_LGM, y)
predictions_BNBClassifier = model_BNB.predict(test_FeatureSelection_LGM)
#scores = cross_val_score(model_logisticRegression, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=20, shuffle = True), scoring='accuracy')
scorer = make_scorer(accuracy_score)
scores = cross_validate(model_BNB, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
optuna_modelsAccuracy.append(scores['test_score'].mean())
scores['test_score'].mean()
0.7484270882991685
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Bernoulli Naive Bayes Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.748690738192622
test average : 0.7484270882991685
5.2.4 XGBClassifier¶
#define objective function for hyperparameter optimization using optuna
def objective_XGBoost(trial):
#define hyperparameters to optimize for
params = {
'n_estimators': trial.suggest_int('n_estimators', 50, 1000),
'max_depth': trial.suggest_int('max_depth', 10, 100),
'learning_rate': trial.suggest_loguniform('learning_rate', 0.005, 1),
'subsample': trial.suggest_uniform('subsample', 0.1, 1),
'colsample_bytree': trial.suggest_uniform('colsample_bytree', 0.5, 1),
#'gamma': trial.suggest_uniform('gamma', 0, 1),
'alpha': trial.suggest_loguniform('alpha', 2, 5),
'lambda': trial.suggest_loguniform('lambda', 2, 5),
'min_child_weight': trial.suggest_int('min_child_weight', 1, 300)
}
#create XGBClassifier model with optimized hyperparameters
model = XGBClassifier(**params, random_state=0)
#evaluate model using cross-validation
#Depending on what features will be utilized replace X with the corresponding dataframe
#Optuna objectives must return a single float, so average across folds
score = cross_val_score(model, X_FeatureSelection_LGM, y, cv=cv).mean()
return score
#run hyperparameter optimization with optuna in a fresh study (reusing the previous study would mix objectives)
study = optuna.create_study(direction='maximize')
study.optimize(objective_XGBoost, n_trials=5)
[W 2024-04-08 03:34:15,660] Trial 10 failed with parameters: {'n_estimators': 836, 'max_depth': 30, 'learning_rate': 0.016810386071667384, 'subsample': 0.7690788740115982, 'colsample_bytree': 0.5552863954783018, 'alpha': 4.4448396844476825, 'lambda': 3.616565525409359, 'min_child_weight': 195} because of the following error: The value array([0.77471264, ..., 0.75345622]) could not be cast to float.
[W] Trials 11-14 failed with the same error: the objective returned the array of per-fold scores rather than a single float.
#get best hyperparameters
best_params_xgbClassifier = study.best_params
print(f'Best hyperparameters: {best_params_xgbClassifier}')
Best hyperparameters: {'alpha': 0.00013753145737869973, 'binarize': 0.19073376401728848, 'fit_prior': False}
#create XGBClassifier model with best hyperparameters
model_XGBClassifier = XGBClassifier(**best_params_xgbClassifier, random_state=0)
#fit and predict using model
model_XGBClassifier.fit(X_FeatureSelection_LGM, y)
predictions_XGBClassifier = model_XGBClassifier.predict(test_FeatureSelection_LGM)
#scores = cross_val_score(model_XGBClassifier, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=20, shuffle = True), scoring='accuracy')
scores = cross_validate(model_XGBClassifier, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
optuna_modelsAccuracy.append(scores['test_score'].mean())
scores['test_score'].mean()
0.8017914084432437
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
print(f"train average : {train_average}")
print(f"test average : {test_average}")
plt.title('XGBoost Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
train average : 0.8821859174683416
test average : 0.8017914084432437
5.2.5 CatBoostClassifier¶
def objective_CatboostClassifier(trial):
#define hyperparameters to optimize for
params = {
'iterations': trial.suggest_int('iterations', 100, 1000),
'learning_rate': trial.suggest_loguniform('learning_rate', 0.005, 1),
'depth': trial.suggest_int('depth', 3, 10),
'loss_function': 'MultiClass'
}
#create CatBoostClassifier model with optimized hyperparameters
model = CatBoostClassifier(**params, random_state=0)
#evaluate model using cross-validation
score = cross_val_score(model, X_FeatureSelection_LGM, y, cv=cv).mean()
return score
%%capture
study = optuna.create_study(direction='maximize')
study.optimize(objective_CatboostClassifier, n_trials=10)
print(f'Best hyperparameters: {study.best_params}')
[I 2024-04-08 03:34:46,388] Trial 15 finished with value: 0.7973102388897717 and parameters: {'iterations': 108, 'learning_rate': 0.07016335800526678, 'depth': 5}. Best is trial 15 with value: 0.7973102388897717.
[I 2024-04-08 03:34:51,609] Trial 16 finished with value: 0.801569203877324 and parameters: {'iterations': 111, 'learning_rate': 0.0693467143375084, 'depth': 5}. Best is trial 16 with value: 0.801569203877324.
[I 2024-04-08 03:34:58,199] Trial 17 finished with value: 0.8027133322739551 and parameters: {'iterations': 126, 'learning_rate': 0.07015836327665975, 'depth': 5}. Best is trial 17 with value: 0.8027133322739551.
[I 2024-04-08 03:35:07,140] Trial 18 finished with value: 0.8045537369564066 and parameters: {'iterations': 151, 'learning_rate': 0.07651922563708288, 'depth': 5}. Best is trial 18 with value: 0.8045537369564066.
[I 2024-04-08 03:35:22,196] Trial 19 finished with value: 0.8023669156205306 and parameters: {'iterations': 414, 'learning_rate': 0.6650806485308475, 'depth': 4}. Best is trial 18 with value: 0.8045537369564066.
[I 2024-04-08 03:39:21,294] Trial 20 finished with value: 0.8034011335346152 and parameters: {'iterations': 930, 'learning_rate': 0.0074741102148396495, 'depth': 9}. Best is trial 18 with value: 0.8045537369564066.
[I 2024-04-08 03:50:28,469] Trial 21 finished with value: 0.8023674453096034 and parameters: {'iterations': 999, 'learning_rate': 0.00742236144255235, 'depth': 10}. Best is trial 18 with value: 0.8045537369564066.
[I 2024-04-08 03:54:57,311] Trial 22 finished with value: 0.8000643572223105 and parameters: {'iterations': 919, 'learning_rate': 0.005215636459560014, 'depth': 9}. Best is trial 18 with value: 0.8045537369564066.
[I 2024-04-08 03:56:10,071] Trial 23 finished with value: 0.8084731712484772 and parameters: {'iterations': 660, 'learning_rate': 0.021154745010366478, 'depth': 8}. Best is trial 23 with value: 0.8084731712484772.
[I 2024-04-08 03:56:56,252] Trial 24 finished with value: 0.8076630118120663 and parameters: {'iterations': 628, 'learning_rate': 0.019079022427720024, 'depth': 7}. Best is trial 23 with value: 0.8084731712484772.
best_params_catboost = study.best_params
print("Best Hyperparameters:", best_params_catboost)
Best Hyperparameters: {'iterations': 660, 'learning_rate': 0.021154745010366478, 'depth': 8}
%%capture
model_CatBoostClassifier = CatBoostClassifier(**best_params_catboost, random_state=0)
model_CatBoostClassifier.fit(X_FeatureSelection_LGM,y)
#score = cross_val_score(model_CatBoostClassifier, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=10, shuffle = True)).mean()
scores = cross_validate(model_CatBoostClassifier, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
optuna_modelsAccuracy.append(scores['test_score'].mean())
scores['test_score'].mean()
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('CatBoost Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.8820102759091111
test average : 0.8034048413581228
5.3 Model tuning using RandomSearch¶
RandomSearch is a hyperparameter tuning algorithm that randomly selects hyperparameter combinations from a predefined space, offering a faster alternative to Grid Search when dealing with numerous hyperparameters [46][47]. Its advantages include the potential to discover unexpected yet effective hyperparameter combinations and a reduced risk of model overfitting [47]. However, its randomness can introduce significant variance in results, and there’s a possibility of missing the optimal hyperparameter set if the number of iterations is insufficient [47].
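The idea can be illustrated with a toy, self-contained sketch in plain Python (independent of the RandomizedSearchCV calls used below): sample hyperparameter combinations at random from predefined ranges, score each one, and keep the best. The quadratic objective here is a hypothetical stand-in for cross-validated accuracy.

```python
import random

random.seed(0)

def toy_objective(lr, depth):
    # Hypothetical score surface with its optimum at lr=0.1, depth=6.
    return 1.0 - (lr - 0.1) ** 2 - 0.01 * (depth - 6) ** 2

best_score, best_params = float("-inf"), None
for _ in range(50):  # n_iter: number of random combinations to evaluate
    params = {
        "lr": random.uniform(0.01, 1.0),   # continuous range
        "depth": random.randint(3, 10),    # integer range
    }
    score = toy_objective(**params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params, round(best_score, 4))
```

Because the samples are random, rerunning with a different seed can return a different winner, which is the variance drawback noted above; increasing n_iter reduces the chance of missing the optimum at the cost of more evaluations.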
randomSearch_modelsAccuracy = []
5.3.1 Logistic Regression¶
model = LogisticRegression()
hyperparameters = {
'penalty': ['l1', 'l2', 'elasticnet', 'none'],
'C': uniform(loc=0, scale=4),
'solver': ['newton-cg', 'lbfgs', 'liblinear', 'sag', 'saga'],
'max_iter': randint(100, 500),
'multi_class': ['auto', 'ovr', 'multinomial'],
# Add other hyperparameters here
}
# Set up the random search with cross-validation
random_search = RandomizedSearchCV(
estimator=model,
param_distributions=hyperparameters,
n_iter=5,
cv=cv
)
scores = cross_validate(random_search, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
randomSearch_modelsAccuracy.append(scores['test_score'].mean())
#score = cross_val_score(random_search, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=10, shuffle = True)).mean()
#randomSearch_modelsAccuracy.append(score)
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Logistic Regression Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.795879336195209
test average : 0.7948977700090046
5.3.2 Random Forest Classifier¶
model = RandomForestClassifier()
param_distributions = {
'n_estimators': randint(100, 2000), # Number of trees in the forest
'max_features': ['auto', 'sqrt', 'log2'], # Number of features to consider at every split
'max_depth': randint(10, 100), # Maximum number of levels in tree
'min_samples_split': randint(2, 20), # Minimum number of samples required to split a node
'min_samples_leaf': randint(1, 20), # Minimum number of samples required at each leaf node
'bootstrap': [True, False], # Method of selecting samples for training each tree
'criterion': ['gini', 'entropy'], # Function to measure the quality of a split
'class_weight': [None, 'balanced', 'balanced_subsample'], # Weights associated with classes
}
random_search = RandomizedSearchCV(
estimator=model,
param_distributions=param_distributions,
n_iter=5, # Number of parameter settings to sample
cv=cv,
n_jobs=-1 # Use all available cores
)
scores = cross_validate(random_search, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
randomSearch_modelsAccuracy.append(scores['test_score'].mean())
# score = cross_val_score(random_search, X_FeatureSelection_LGM, y, cv=StratifiedKFold(n_splits=10, shuffle = True)).mean()
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Random Forest Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.8652581017835302
test average : 0.7958186344615711
5.3.3 Naive Bayes Classifier¶
5.3.3.1 Gaussian Naive Bayes¶
from sklearn.naive_bayes import GaussianNB
model = GaussianNB()
param_distributions = {
'var_smoothing': [1e-9, 1e-8, 1e-7] # Smoothing parameter for variance
}
random_search = RandomizedSearchCV(
estimator=model,
param_distributions=param_distributions,
n_iter=5, # Number of parameter settings to sample
cv=cv,
n_jobs=-1 # Use all available cores
)
scores = cross_validate(random_search, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
randomSearch_modelsAccuracy.append(scores['test_score'].mean())
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Gaussian Naive Bayes Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.7072720656632068
test average : 0.7067781662164309
5.3.3.2 Bernoulli Naive Bayes¶
from sklearn.naive_bayes import BernoulliNB
model = BernoulliNB()
param_distributions = {
'alpha': [0.01, 0.1, 1.0, 10.0, 100.0], # Additive (Laplace/Lidstone) smoothing parameter
'binarize': [0.0, 0.5, 1.0], # Threshold for binarizing of sample features
'fit_prior': [True, False] # Whether to learn class prior probabilities or not
}
random_search = RandomizedSearchCV(
estimator=model,
param_distributions=param_distributions,
n_iter=5, # Number of parameter settings to sample
cv=cv,
n_jobs=-1 # Use all available cores
)
scores = cross_validate(random_search, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
randomSearch_modelsAccuracy.append(scores['test_score'].mean())
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'r',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('Bernoulli Naive Bayes Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.7457966717271611
test average : 0.7447476031569469
5.3.4 XGBoostClassifier¶
model = XGBClassifier()
xgb_param_distributions = {
'n_estimators': randint(100, 1000),
'max_depth': randint(3, 10),
'learning_rate': uniform(0.01, 0.6),
'subsample': uniform(0.3, 0.7),
'colsample_bytree': uniform(0.5, 0.9),
'min_child_weight': randint(1, 10),
'gamma': uniform(0, 0.5),
'reg_lambda': uniform(1, 4),
}
xgb_random_search = RandomizedSearchCV(
estimator=model,
param_distributions=xgb_param_distributions,
n_iter=5,
cv=cv,
n_jobs=-1
)
scores = cross_validate(xgb_random_search, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
randomSearch_modelsAccuracy.append(scores['test_score'].mean())
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'b',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('XGBoost Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.8754588153258526
test average : 0.8004176598336776
5.3.5 CatBoostClassifier¶
%%capture
model = CatBoostClassifier()
catboost_param_distributions = {
'iterations': randint(100, 1000),
'depth': randint(4, 10),
'learning_rate': uniform(0.01, 0.3),
'random_strength': randint(1, 10),
'bagging_temperature': uniform(0.0, 1.0),
'l2_leaf_reg': randint(1, 10),
'border_count': randint(1, 255),
'scale_pos_weight': uniform(0.01, 1.0),
}
catboost_random_search = RandomizedSearchCV(
estimator=model,
param_distributions=catboost_param_distributions,
n_iter=5,
cv=cv,
n_jobs=-1
)
scores = cross_validate(catboost_random_search, X_FeatureSelection_LGM, y, cv=cv, scoring=scorer, return_train_score=True)
randomSearch_modelsAccuracy.append(scores['test_score'].mean())
cv_results = range(1, len(scores['train_score']) + 1)
train_average = np.mean(scores['train_score'])
test_average = np.mean(scores['test_score'])
plt.plot(cv_results, scores['train_score'], 'b',marker='o', label='Training Accuracy')
plt.plot(cv_results, scores['test_score'], 'g', marker='o', label='Test Accuracy')
plt.title('CatBoost Cross-Validation Accuracy')
plt.xlabel('Cross-Validation Fold')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
print(f"train average : {train_average}")
print(f"test average : {test_average}")
train average : 0.8681213938625592
test average : 0.7976563906986599
5.4 Comparison of GridSearch, RandomSearch and Optuna¶
def tabulate_vectors(vector1, vector2, vector3,vector4):
# Determine the maximum width for the model names
max_width = max(len(name) for name in vector1) + 2
# Print the header with fixed spacing
print(f"{'Model'.ljust(max_width)}\t{'GridSearch Accuracy'.ljust(20)}\t{'Random Search'.ljust(20)}\t{'Optuna Accuracy'.ljust(20)}")
# Print each row of the combined vectors with fixed spacing
for row in zip(vector1, vector2, vector3,vector4):
print(f"{row[0].ljust(max_width)}\t{str(row[1]).ljust(20)}\t{str(row[2]).ljust(20)}\t{str(row[3]).ljust(20)}")
tabulate_vectors(["Logistic Regression", "Random Forest", "Gaussian Naive Bayes", "Bernoulli Naive Bayes", "XGBoost", "CatBoost"], gridSearch_modelsAccuracy, randomSearch_modelsAccuracy, optuna_modelsAccuracy)
Model                  GridSearch Accuracy   Random Search         Optuna Accuracy
Logistic Regression    0.7855089131684876    0.7948977700090046    0.7951210339530697
Random Forest          0.7826336975273146    0.7958186344615711    0.7986972297261508
Gaussian Naive Bayes   0.7078780908568143    0.7067781662164309    0.7070075215848296
Bernoulli Naive Bayes  0.7487061529614721    0.7447476031569469    0.7484270882991685
XGBoost                0.7866589994249569    0.8004176598336776    0.8017914084432437
CatBoost               0.7941345600920069    0.7976563906986599    0.8034048413581228
The comparison of the Optuna, GridSearch, and Random Search hyperparameter optimization algorithms indicates that Optuna converges to better hyperparameters for most models. The models tuned by Optuna achieve slightly higher accuracy than those produced by GridSearch and Random Search. Optuna is therefore adopted, as it reaches superior hyperparameters with fewer trials and less computational cost than the other methods.
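To put "fewer trials" in perspective, consider a hypothetical exhaustive grid over the XGBoost search space used above. The per-parameter point counts below are illustrative assumptions, not values taken from this notebook:

```python
# Hypothetical grid resolution per XGBoost hyperparameter (illustrative only).
grid_points = {
    "n_estimators": 10,
    "max_depth": 10,
    "learning_rate": 10,
    "subsample": 5,
    "colsample_bytree": 5,
    "min_child_weight": 5,
}

# An exhaustive grid fits every combination, so the budget is the product
# of the per-parameter counts.
total_fits = 1
for points in grid_points.values():
    total_fits *= points

print(total_fits)  # 125000 candidate fits, versus the 5-10 trials given
                   # to Optuna and RandomizedSearchCV in this notebook
```

Even at this coarse resolution the grid requires five orders of magnitude more model fits than the sampling-based searches, which is why GridSearch is only practical on small spaces.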
6. Model Selection¶
CatBoost, tuned with Optuna, is selected as the final model. This choice is supported by its performance relative to the other models evaluated.

As shown in the table above, CatBoost achieves the highest accuracy of all the models under both GridSearch and Optuna. With Optuna optimization, its accuracy exceeds that of Logistic Regression, Random Forest, Naive Bayes, and XGBoost.

The combination of Optuna and CatBoost is therefore adopted for the final submission, given its superior accuracy on this dataset.
sample = pd.read_csv("sample_submission.csv")
sample['Transported'] = model_CatBoostClassifier.predict(test_FeatureSelection_LGM)
#This converts the numbers to True/False values
sample['Transported']=sample['Transported'].astype(bool)
sample.to_csv('submission.csv', index=False)
display.Image("bestEntry.png")
display.Image("results.png")
Conclusion¶
The notebook highlights the selection of CatBoost alongside Optuna for model selection, emphasizing CatBoost's superior performance compared to other models. The table provided illustrates CatBoost's consistently higher accuracy scores, both with GridSearch and Optuna, outperforming Logistic Regression, Random Forest, Naïve Bayes and XGBoost. This superior performance positions CatBoost as an ideal candidate for the model selection process. Consequently, the combination of Optuna with CatBoost is deemed the preferred approach for solving the model selection task due to its ability to deliver superior accuracy.
References¶
[1] S. Cortinhas, "Spaceship Titanic: A Complete Guide," Kaggle, https://www.kaggle.com/code/samuelcortinhas/spaceship-titanic-a-complete-guide#EDA (accessed Mar. 28, 2024).
[2] GfG, “ML Handling Missing Values,” GeeksforGeeks, May 04, 2018. Link (Accessed Mar. 31, 2024).
[3] GfG, “Python Imputation using the KNNimputer(),” GeeksforGeeks, Aug. 10, 2020. Link (Accessed Apr. 02, 2024).
[4] GfG, “Feature Engineering Scaling, Normalization, and Standardization,” GeeksforGeeks, Jul. 02, 2018. Link (Accessed Apr. 01, 2024).
[5] GfG, “What is StandardScaler?,” GeeksforGeeks, Feb. 09, 2024. Link (Accessed Apr. 01, 2024).
[6] Potdar, K., Pardawala, T., and Pai, C. “A Comparative Study of Categorical Variable Encoding Techniques for Neural Network Classifiers,” IJCA, vol. 175, no. 4, pp. 7–9, Oct. 2017, doi: 10.5120/ijca2017915495.
[7] Sujatha M., “Why we have to remove highly correlated features in Machine Learning?,” Medium, Nov. 29, 2023. Link (Accessed Apr. 02, 2024).
[8] H. Kalkan, "SpaceshipTitanic : XGBClassifier+OPTUNA," Kaggle, https://www.kaggle.com/code/hseyinkalkan/spaceshiptitanic-xgbclassifier-optuna (accessed Mar. 28, 2024).
[9] GeeksforGeeks, "Cross Validation in Machine Learning," GeeksforGeeks, Link (Accessed Apr. 02, 2024)
[10] V. Chugh, “LGBMClassifier: A Getting Started Guide,” KDnuggets, Jul. 29, 2023. Link (Accessed Apr. 02, 2024).
[11] GfG, “LightGBM (Light Gradient Boosting Machine),” GeeksforGeeks, Jul. 15, 2020. Link (Accessed Apr. 02, 2024).
[12] T. G. Mesevage, “Machine Learning Classifiers - The Algorithms & How They Work,” MonkeyLearn Blog, Dec. 14, 2020. Link (Accessed Apr. 02, 2024).
[13] S. Asiri, “Classification in Machine Learning: An Introduction | Built In,” builtin.com, Nov. 15, 2022. Link (Accessed Apr. 02, 2024).
[14] A. Mondal, “LightGBM in Python | Complete guide on how to Use LightGBM in Python,” Analytics Vidhya, Aug. 18, 2021. Link (Accessed Apr. 02, 2024).
[15] T. Aggarwal, “Empower Your Machine Learning Models with LightGBM: A Step-by-Step Guide,” Medium, Aug. 07, 2023. Link (Accessed Apr. 02, 2024).
[16] J. Brownlee, “Recursive Feature Elimination (RFE) for Feature Selection in Python,” Machine Learning Mastery, May 24, 2020. Link (Accessed Apr. 02, 2024).
[17] Avcontentteam, “Recursive Feature Elimination: Working, Advantages & Examples,” Analytics Vidhya, May 17, 2023. Link (Accessed Apr. 02, 2024).
[18] scikit-learn, “sklearn.feature_selection.RFE,” 2024. Link (Accessed Apr. 02, 2024).
[19] K. Li, “Model tuning and what is it ?(using python),” Medium, Link (Accessed Apr. 1, 2024).
[20] D. David, “Hyperparameter optimization techniques to improve your machine learning model’s performance,” freeCodeCamp.org, Link (Accessed Apr. 4, 2024).
[21] R. Joseph, “Grid Search for model tuning,” Medium. Link (Accessed Apr. 4, 2024).
[22] GeeksforGeeks, “Understanding Logistic Regression,” GeeksforGeeks. Link (Accessed Apr. 4, 2024).
[23] Towards Data Science, “Logistic Regression Explained,” Towards Data Science. Link (Accessed Apr. 4, 2024).
[24] OpenGenus IQ, “Advantages and Disadvantages of Logistic Regression,” OpenGenus IQ. Link (Accessed Apr. 4, 2024).
[25] Medium, “Do I need to tune logistic regression hyperparameters?,” Medium. Link (Accessed Apr. 4, 2024).
[26] scikit-learn, “Linear Models (scikit-learn Documentation),” scikit-learn. Link (Accessed Apr. 4, 2024).
[27] M. Chaudhary, “Random Forest Algorithm - How It Works and Why It Is So Effective.” Link (Accessed Apr. 4, 2024).
[28] GfG, “Random Forest Algorithm in Machine Learning,” GeeksforGeeks, Feb. 22, 2024. Link (Accessed Apr. 4, 2024).
[29] Scikit-learn, “sklearn.ensemble.RandomForestClassifier — scikit-learn 0.20.3 documentation,” 2018. Link (Accessed Apr. 4, 2024).
[30] N. Kumar, “Naive Bayes Classifiers - GeeksforGeeks,” GeeksforGeeks, Jan. 14, 2019. [Link](https://www.geeksforgeeks.org/naive-bayes-classifiers/) (Accessed Apr. 2, 2024).
[31] IBM, “What Are Naïve Bayes Classifiers? | IBM,” www.ibm.com. https://www.ibm.com/topics/naive-bayes#:~:text=Na%C3%AFve%20Bayes%20is%20part%20of (Accessed Apr. 2, 2024).
[32] C. Martins, “Gaussian Naive Bayes Explained With Scikit-Learn | Built In,” builtin.com, Nov. 2, 2023. Link (Accessed Apr. 2, 2024).
[33] GfG, “Gaussian Naive Bayes,” GeeksforGeeks, Nov. 13, 2023. Link (Accessed Apr. 2, 2024).
[34] Scikit-learn, “sklearn.naive_bayes.GaussianNB — scikit-learn 0.22.1 documentation,” scikit-learn.org. Link (Accessed Apr. 2, 2024).
[35] K. Jain, “How to Improve Naive Bayes?,” Analytics Vidhya, Apr. 3, 2021. Link (Accessed Apr. 2, 2024).
[36] GfG, “Bernoulli Naive Bayes,” GeeksforGeeks, Oct. 25, 2023. Link (Accessed Apr. 2, 2024).
[37] N. Mutha, “Bernoulli Naive Bayes,” OpenGenus IQ: Learn Computer Science, May 30, 2020. Link (Accessed Apr. 2, 2024).
[38] GeeksforGeeks, “XGBoost,” GeeksforGeeks. Link (Accessed Apr. 6, 2024).
[39] P. Kumar, “A Guide on XGBoost Hyperparameters Tuning,” Kaggle. Link (Accessed Apr. 6, 2024).
[40] TrainTestSplit, “XGBoost Hyperparameter Tuning,” TrainTestSplit. Link (Accessed Apr. 6, 2024).
[41] GfG, “CatBoost in Machine Learning,” GeeksforGeeks, Jan. 20, 2021. Link (Accessed Apr. 3, 2024).
[42] ESRI, “How CatBoost algorithm works,” 2024. Link (Accessed Apr. 3, 2024).
[43] Nik, “Python optuna: A guide to hyperparameter optimization • datagy,” datagy. Link (Accessed Apr. 1, 2024).
[44] A. Shahrour, “Optuna vs GridSearch,” Medium. Link (Accessed Apr. 1, 2024).
[45] B. Baldé, “Bayesian sorcery for hyperparameter optimization using optuna,” Medium. Link (Accessed Apr. 1, 2024).
[46] J. Brownlee, “Random search and grid search for function optimization,” MachineLearningMastery.com. Link (Accessed Apr. 5, 2024).
[47] B. Gupta, “Random search in machine learning,” Scaler Topics. Link (Accessed Apr. 5, 2024).